Unverified commit d98c63c0, authored by Leaminn Ma, committed by GitLab

Merge branch '477258-add-additional-context-to-AiChatInput' into 'master'

Add additional context to AiChat GraphQL mutation

See merge request https://gitlab.com/gitlab-org/gitlab/-/merge_requests/161898



Merged-by: Leaminn Ma <lma@gitlab.com>
Approved-by: Shinya Maeda <shinya@gitlab.com>
Reviewed-by: Shinya Maeda <shinya@gitlab.com>
Reviewed-by: Tetiana Chupryna <tchupryna@gitlab.com>
Co-authored-by: Missy Davies <mdavies@gitlab.com>
Co-authored-by: Vitali Tatarintev <vtatarintev@gitlab.com>
Showing 181 additions and 12 deletions
...@@ -34641,6 +34641,15 @@ Action to subscribe to.
| ----- | ----------- |
| <a id="aiactionchat"></a>`CHAT` | Chat action. |
   
### `AiAdditionalContextType`
The type of additional context.
| Value | Description |
| ----- | ----------- |
| <a id="aiadditionalcontexttypefile"></a>`FILE` | File content type. |
| <a id="aiadditionalcontexttypesnippet"></a>`SNIPPET` | Snippet content type. |
### `AiMessageRole`
   
Possible message roles for AI features.
...@@ -39880,12 +39889,23 @@ be used as arguments).
Only general use input types are listed here. For mutation input types,
see the associated mutation type above.
   
### `AiAdditionalContextInput`
#### Arguments
| Name | Type | Description |
| ---- | ---- | ----------- |
| <a id="aiadditionalcontextinputcontent"></a>`content` | [`String!`](#string) | Content of the additional context. |
| <a id="aiadditionalcontextinputname"></a>`name` | [`String!`](#string) | Name of the additional context. |
| <a id="aiadditionalcontextinputtype"></a>`type` | [`AiAdditionalContextType!`](#aiadditionalcontexttype) | Type of the additional context. |
### `AiChatInput`

#### Arguments

| Name | Type | Description |
| ---- | ---- | ----------- |
| <a id="aichatinputadditionalcontext"></a>`additionalContext` | [`[AiAdditionalContextInput!]`](#aiadditionalcontextinput) | Additional context to be passed for the chat. |
| <a id="aichatinputagentversionid"></a>`agentVersionId` | [`AiAgentVersionID`](#aiagentversionid) | Global ID of the agent version to answer the chat. |
| <a id="aichatinputcontent"></a>`content` | [`String!`](#string) | Content of the message. |
| <a id="aichatinputcurrentfile"></a>`currentFile` | [`AiCurrentFileInput`](#aicurrentfileinput) | Information about currently selected text which can be passed for additional context. |
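A minimal sketch of what a client request using the new `additionalContext` argument could look like. The mutation string and variable values are illustrative only; the nesting under `aiAction(input: { chat: ... })` follows the request spec in this MR, but the exact client wiring is an assumption.

```ruby
require 'json'

# Hypothetical GraphQL request payload for the aiAction chat mutation,
# exercising the new additionalContext argument (field names from the
# AiChatInput / AiAdditionalContextInput tables above).
mutation = <<~GRAPHQL
  mutation($input: AiChatInput!) {
    aiAction(input: { chat: $input }) { errors }
  }
GRAPHQL

variables = {
  input: {
    content: 'summarize',
    additionalContext: [
      # `type` must be one of the AiAdditionalContextType enum values.
      { type: 'SNIPPET', name: 'hello world', content: 'puts "Hello, world"' }
    ]
  }
}

# This JSON body would be POSTed to the GraphQL endpoint.
payload = JSON.generate(query: mutation, variables: variables)
```

Note that `content` and `name` are length-validated server-side (see `AdditionalContextInputType` below in this diff).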
...@@ -102,6 +102,8 @@ def extract_method_params!(attributes)
method = methods.each_key.first
method_arguments = options.merge(methods[method])
method_arguments.delete(:additional_context) if Feature.disabled?(:duo_additional_context, current_user)
[method_arguments.delete(:resource_id), method, method_arguments]
end
end
......
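The guard above strips the new parameter before it reaches the worker whenever the `duo_additional_context` flag is disabled for the user. A self-contained sketch of that behaviour, with a stand-in `Feature` stub (not GitLab's real implementation):

```ruby
# Stand-in for GitLab's Feature module, hardcoded to report the flag as off.
module Feature
  def self.disabled?(flag, _user)
    flag == :duo_additional_context
  end
end

current_user = :some_user
method_arguments = {
  content: 'summarize',
  additional_context: [{ type: 'snippet', name: 'demo', content: 'puts 1' }]
}

# Same guard as in extract_method_params!: drop the key when the flag is off.
method_arguments.delete(:additional_context) if Feature.disabled?(:duo_additional_context, current_user)
```

With the flag off, downstream code never sees `additional_context`, which keeps the rollout reversible.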
# frozen_string_literal: true
module Types
module Ai
class AdditionalContextInputType < BaseInputObject
graphql_name 'AiAdditionalContextInput'
MAX_BODY_SIZE = ::API::CodeSuggestions::MAX_BODY_SIZE
MAX_CONTEXT_NAME_SIZE = ::API::CodeSuggestions::MAX_CONTEXT_NAME_SIZE
argument :type, Types::Ai::AdditionalContextTypeEnum,
required: true,
description: 'Type of the additional context.'
argument :name, GraphQL::Types::String,
required: true,
description: 'Name of the additional context.',
validates: { length: { maximum: MAX_CONTEXT_NAME_SIZE } }
argument :content, GraphQL::Types::String,
required: true,
description: 'Content of the additional context.',
validates: { length: { maximum: MAX_BODY_SIZE } }
end
end
end
# frozen_string_literal: true
module Types
module Ai
class AdditionalContextTypeEnum < BaseEnum
graphql_name 'AiAdditionalContextType'
description 'The type of additional context.'
::CodeSuggestions::Prompts::CodeGeneration::AnthropicMessages::CONTENT_TYPES.each_value do |type|
value type.upcase, description: "#{type.capitalize} content type.", value: type
end
end
end
end
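The enum values in `AdditionalContextTypeEnum` are generated from a constant rather than listed by hand. A sketch of that derivation, assuming `CONTENT_TYPES` maps to the strings `'file'` and `'snippet'` (the real constant lives in the Code Suggestions prompt code):

```ruby
# Assumed shape of ::CodeSuggestions::Prompts::CodeGeneration::
# AnthropicMessages::CONTENT_TYPES.
CONTENT_TYPES = { file: 'file', snippet: 'snippet' }.freeze

# GraphQL enum names are upcased; the lowercase string stays as the wire value,
# matching the `value type.upcase, ..., value: type` call above.
values = CONTENT_TYPES.each_value.to_h do |type|
  [type.upcase, { description: "#{type.capitalize} content type.", value: type }]
end
```

This is why the docs table shows `FILE` and `SNIPPET` while the mutation downstream receives `'file'` / `'snippet'`.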
...@@ -28,6 +28,10 @@ class ChatInputType < BaseMethodInputType
argument :current_file, ::Types::Ai::CurrentFileInputType,
required: false,
description: 'Information about currently selected text which can be passed for additional context.'
argument :additional_context, [::Types::Ai::AdditionalContextInputType],
required: false,
description: 'Additional context to be passed for the chat.'
end
end
end
---
name: duo_additional_context
feature_issue_url: https://gitlab.com/gitlab-org/gitlab/-/issues/477258
introduced_by_url: https://gitlab.com/gitlab-org/gitlab/-/merge_requests/161898
rollout_issue_url: https://gitlab.com/gitlab-org/gitlab/-/issues/477503
milestone: '17.3'
group: group::code creation
type: wip
default_enabled: false
...@@ -124,7 +124,8 @@ def prompt_options
current_resource_params: current_resource_params,
current_file_params: current_file_params,
model_metadata: model_metadata_params,
single_action_agent: true,
additional_context: context.additional_context
}
end
......
...@@ -5,14 +5,15 @@ module Llm
module Chain
class GitlabContext
attr_accessor :current_user, :container, :resource, :ai_request, :tools_used, :extra_resource, :request_id,
:current_file, :agent_version, :additional_context
delegate :current_page_type, :current_page_sentence, :current_page_short_description,
to: :authorized_resource, allow_nil: true
# rubocop:disable Metrics/ParameterLists -- we probably need to rethink this initializer
def initialize(
current_user:, container:, resource:, ai_request:, extra_resource: {}, request_id: nil,
current_file: {}, agent_version: nil, additional_context: []
)
@current_user = current_user
@container = container
...@@ -23,7 +24,9 @@ def initialize(
@request_id = request_id
@current_file = (current_file || {}).with_indifferent_access
@agent_version = agent_version
@additional_context = additional_context
end
# rubocop:enable Metrics/ParameterLists
def resource_serialized(content_limit:)
return '' unless authorized_resource
......
...@@ -165,7 +165,8 @@ def request_body_chat_2(prompt:, options: {})
steps: options[:agent_scratchpad]
},
context: options[:current_resource_params],
current_file: options[:current_file_params],
additional_context: options[:additional_context]
}.compact
{
......
...@@ -43,7 +43,8 @@ def initialize(prompt_message, ai_prompt_class, options = {})
extra_resource: options.delete(:extra_resource) || {},
request_id: prompt_message.request_id,
current_file: options.delete(:current_file),
agent_version: options[:agent_version_id] && ::Ai::AgentVersion.find_by_id(options[:agent_version_id]),
additional_context: ::CodeSuggestions::Context.new(Array.wrap(options.delete(:additional_context))).trimmed
)
end
......
# frozen_string_literal: true
require 'spec_helper'
RSpec.describe GitlabSchema.types['AiAdditionalContextInput'], feature_category: :duo_chat do
include GraphqlHelpers
it { expect(described_class.graphql_name).to eq('AiAdditionalContextInput') }
it 'has the expected fields' do
expected_fields = %w[type name content]
expect(described_class.arguments.keys).to match_array(expected_fields)
end
end
# frozen_string_literal: true
require 'spec_helper'
RSpec.describe GitlabSchema.types['AiAdditionalContextType'], feature_category: :duo_chat do
it 'exposes all additional context types' do
expect(described_class.values.keys).to match_array(%w[FILE SNIPPET])
end
end
...@@ -83,6 +83,7 @@
{
prompt: user_input,
options: {
additional_context: [],
agent_scratchpad: [],
conversation: "",
single_action_agent: true,
...@@ -217,6 +218,7 @@
{
prompt: user_input,
options: {
additional_context: [],
agent_scratchpad: [],
conversation: "",
single_action_agent: true,
......
...@@ -8,10 +8,15 @@
let_it_be(:project) { create(:project, group: group) }
let(:resource) { nil }
let(:ai_request) { instance_double(Gitlab::Llm::Chain::Requests::Anthropic) }
let(:additional_context) do
[
{ type: 'snippet', name: 'hello world', content: 'puts "Hello, world"' }
]
end
subject(:context) do
described_class.new(current_user: user, container: nil, resource: resource, ai_request: ai_request,
agent_version: instance_double(Ai::AgentVersion), additional_context: additional_context)
end
before_all do
......
...@@ -26,8 +26,20 @@
}
end
let(:additional_context) do
[
{ type: 'snippet', name: 'hello world', content: 'puts "Hello, world"' }
]
end
let(:options) do
{
content: content,
extra_resource: extra_resource,
current_file: current_file,
agent_version_id: agent_version.id,
additional_context: additional_context
}
end
let(:container) { group }
...@@ -39,7 +51,8 @@
request_id: 'uuid',
ai_request: ai_request,
current_file: current_file,
agent_version: agent_version,
additional_context: additional_context
)
end
...@@ -89,7 +102,8 @@
.and_return(response_handler)
expect(::Gitlab::Llm::Chain::GitlabContext).to receive(:new)
.with(current_user: user, container: expected_container, resource: resource, ai_request: ai_request,
extra_resource: extra_resource, request_id: 'uuid', current_file: current_file, agent_version: agent_version,
additional_context: additional_context)
.and_return(context)
expect(categorize_service).to receive(:execute)
expect(::Llm::ExecuteMethodService).to receive(:new)
...@@ -182,6 +196,16 @@
end
end
describe '.initialize' do
subject { described_class.new(prompt_message, nil, **options) }
it 'trims additional context' do
expect(::CodeSuggestions::Context).to receive(:new).with(additional_context).and_call_original
subject
end
end
describe '#execute' do
before do
allow(Gitlab::Llm::Chain::Requests::AiGateway).to receive(:new).and_return(ai_request)
...@@ -228,7 +252,7 @@
expect(::Gitlab::Llm::Chain::GitlabContext).to receive(:new)
.with(current_user: user, container: expected_container, resource: resource,
ai_request: ai_request, extra_resource: extra_resource, request_id: 'uuid',
current_file: current_file, agent_version: agent_version, additional_context: additional_context)
.and_return(context)
# This is temporarily commented out due to the following production issue:
# https://gitlab.com/gitlab-com/gl-infra/production/-/issues/18191
...@@ -380,7 +404,7 @@
allow(::Gitlab::Llm::Chain::GitlabContext).to receive(:new)
.with(current_user: user, container: expected_container, resource: resource, ai_request: ai_request,
extra_resource: extra_resource, request_id: 'uuid', current_file: current_file,
agent_version: agent_version, additional_context: additional_context)
.and_return(context)
expect(categorize_service).not_to receive(:execute)
...@@ -413,7 +437,7 @@
expect(::Gitlab::Llm::Chain::GitlabContext).to receive(:new)
.with(current_user: user, container: expected_container, resource: resource, ai_request: ai_request,
extra_resource: extra_resource, request_id: 'uuid', current_file: current_file,
agent_version: agent_version, additional_context: additional_context)
.and_return(context)
expect(categorize_service).to receive(:execute)
expect(::Llm::ExecuteMethodService).to receive(:new)
......
...@@ -105,4 +105,37 @@
expect(graphql_mutation_response(:ai_action)['errors']).to eq([])
end
end
context 'when additional_context is present' do
let(:additional_context) do
[
{ type: 'SNIPPET', name: 'hello world', content: 'puts "Hello, world"' }
]
end
let(:expected_additional_context) do
[
{ type: 'snippet', name: 'hello world', content: 'puts "Hello, world"' }
]
end
let(:params) do
{ chat: { resource_id: resource&.to_gid, content: "summarize", additional_context: additional_context } }
end
it 'successfully performs a chat request' do
expect(Llm::CompletionWorker).to receive(:perform_for).with(
an_object_having_attributes(
user: current_user,
resource: resource,
ai_action: :chat,
content: "summarize"),
hash_including(additional_context: expected_additional_context)
)
post_graphql_mutation(mutation, current_user: current_user)
expect(graphql_mutation_response(:ai_action)['errors']).to eq([])
end
end
end