陈曦 / sub2api / Commits

Commit bea1520c
Authored Apr 13, 2026 by fjl5; committed by 陈曦 on Apr 17, 2026
Parent: c0b2cacb
Changes: 1 file

add prompt_cache_key injection for messages→responses

backend/internal/service/openai_gateway_messages.go
...
...
@@ -121,6 +121,28 @@ func (s *OpenAIGatewayService) ForwardAsAnthropic(
		}
	}

	// For API key accounts (including OpenAI-compatible upstream gateways),
	// ensure promptCacheKey is also propagated via the request body so that
	// upstreams using the Responses API can derive a stable session identifier
	// from prompt_cache_key. This makes our Anthropic /v1/messages compatibility
	// path behave more like a native Responses client.
	if account.Type == AccountTypeAPIKey {
		if trimmedKey := strings.TrimSpace(promptCacheKey); trimmedKey != "" {
			var reqBody map[string]any
			if err := json.Unmarshal(responsesBody, &reqBody); err != nil {
				return nil, fmt.Errorf("unmarshal for prompt cache key injection: %w", err)
			}
			if existing, ok := reqBody["prompt_cache_key"].(string); !ok || strings.TrimSpace(existing) == "" {
				reqBody["prompt_cache_key"] = trimmedKey
				updated, err := json.Marshal(reqBody)
				if err != nil {
					return nil, fmt.Errorf("remarshal after prompt cache key injection: %w", err)
				}
				responsesBody = updated
			}
		}
	}

	// 5. Get access token
	token, _, err := s.GetAccessToken(ctx, account)
	if err != nil {
...
...
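The injected field follows a "set only if absent or blank" policy: a caller-supplied `prompt_cache_key` never clobbers one already present in the request body. The core of that logic can be sketched as a standalone, runnable function; the name `injectPromptCacheKey` is illustrative and not part of the commit, which inlines this logic in `ForwardAsAnthropic`.

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// injectPromptCacheKey (hypothetical helper) sets prompt_cache_key in a JSON
// request body when the caller supplies a non-blank key and the body does not
// already carry a non-blank string value, mirroring the commit's behavior.
func injectPromptCacheKey(body []byte, promptCacheKey string) ([]byte, error) {
	trimmedKey := strings.TrimSpace(promptCacheKey)
	if trimmedKey == "" {
		return body, nil // nothing to inject
	}
	var reqBody map[string]any
	if err := json.Unmarshal(body, &reqBody); err != nil {
		return nil, fmt.Errorf("unmarshal for prompt cache key injection: %w", err)
	}
	// Inject only when the field is absent, not a string, or blank.
	if existing, ok := reqBody["prompt_cache_key"].(string); !ok || strings.TrimSpace(existing) == "" {
		reqBody["prompt_cache_key"] = trimmedKey
		updated, err := json.Marshal(reqBody)
		if err != nil {
			return nil, fmt.Errorf("remarshal after prompt cache key injection: %w", err)
		}
		return updated, nil
	}
	return body, nil
}

func main() {
	// Absent key: injected (json.Marshal emits map keys in sorted order).
	out, err := injectPromptCacheKey([]byte(`{"model":"gpt-4.1"}`), "  session-123  ")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out)) // {"model":"gpt-4.1","prompt_cache_key":"session-123"}

	// Existing non-blank key: body returned unchanged.
	kept, _ := injectPromptCacheKey([]byte(`{"prompt_cache_key":"keep-me"}`), "session-123")
	fmt.Println(string(kept)) // {"prompt_cache_key":"keep-me"}
}
```

Note the trade-off: round-tripping the body through a `map[string]any` drops the original key order and any duplicate fields, which is acceptable here because the upstream Responses API consumes the JSON structurally.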