983ca8992e
* This commit enables the router to forward form-data to the model server. Fixes #22044 (enabling use of /v1/audio/transcriptions in router mode)
* Applied the suggestion from Copilot's first comment: using the non-throwing json::parse overload
* Addressed Copilot's third comment by extending the file representation to also include filename and content-type
* Addressed Copilot's fourth comment by making the RNG thread_local
* Changed variable body from std::string to std::ostringstream in build_multipart_body, as suggested by ngxson in https://github.com/ggml-org/llama.cpp/pull/22118#discussion_r3127099053
* Added sanitize_field lambda in build_multipart_body for key, filename and content_type, as suggested by ngxson in https://github.com/ggml-org/llama.cpp/pull/22118#discussion_r3127104647
* Explicitly check whether value/item is a string before calling value/item.get<std::string>(), as requested by ngxson in https://github.com/ggml-org/llama.cpp/pull/22118#discussion_r3127111279
* Added double quote to the sanitize lambda and throw on json parse failure

---------

Co-authored-by: Ralph Paßgang <ralph@trust-it.de>
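The commit message above mentions three implementation details of the multipart forwarding path: a thread_local RNG, a sanitize_field lambda, and accumulating the body in a std::ostringstream. A minimal self-contained sketch of how those pieces could fit together is below; the function names, the boundary prefix, and the field layout are assumptions for illustration, not the actual llama.cpp implementation.

```cpp
// Hypothetical sketch (not the real llama.cpp code): a multipart/form-data
// body builder with a thread_local RNG for the boundary and a sanitize
// lambda for values embedded in header lines.
#include <cassert>
#include <map>
#include <random>
#include <sstream>
#include <string>

static std::string random_boundary() {
    // thread_local so concurrent router threads do not share RNG state
    static thread_local std::mt19937 rng{std::random_device{}()};
    static const char hex[] = "0123456789abcdef";
    std::uniform_int_distribution<int> dist(0, 15);
    std::string b = "----llamacpp";              // assumed prefix
    for (int i = 0; i < 16; i++) {
        b += hex[dist(rng)];
    }
    return b;
}

static std::string build_multipart_body(
        const std::map<std::string, std::string> & fields,
        const std::string & boundary) {
    // strip CR, LF and double quotes, which would otherwise break the
    // Content-Disposition header line (the commit adds the double quote)
    auto sanitize_field = [](const std::string & s) {
        std::string out;
        for (char c : s) {
            if (c != '\r' && c != '\n' && c != '"') {
                out += c;
            }
        }
        return out;
    };
    // std::ostringstream instead of repeated std::string concatenation
    std::ostringstream body;
    for (const auto & [key, value] : fields) {
        body << "--" << boundary << "\r\n";
        body << "Content-Disposition: form-data; name=\""
             << sanitize_field(key) << "\"\r\n\r\n";
        body << value << "\r\n";
    }
    body << "--" << boundary << "--\r\n";
    return body.str();
}
```

A real file part would additionally carry the filename and Content-Type mentioned in the commit; this sketch only shows plain fields to keep the structure visible.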
27 lines
906 B
C++
// Chat conversion functions for server (Responses API, Anthropic API, OAI streaming diffs)

#pragma once

#include "chat.h"
#include "server-common.h"
#include "server-http.h"

#include <nlohmann/json_fwd.hpp>

using json = nlohmann::ordered_json;

// Convert OpenAI Responses API format to OpenAI Chat Completions API format
json server_chat_convert_responses_to_chatcmpl(const json & body);

// Convert Anthropic Messages API format to OpenAI Chat Completions API format
json server_chat_convert_anthropic_to_oai(const json & body);

// Convert OpenAI Transcriptions API format to OpenAI Chat Completions API format
json convert_transcriptions_to_chatcmpl(
        const json & body,
        const common_chat_templates * tmpls,
        const std::map<std::string, uploaded_file> & in_files,
        std::vector<raw_buffer> & out_files);

json server_chat_msg_diff_to_json_oaicompat(const common_chat_msg_diff & diff);