Unverified commit 123412d9, authored by Alexandru Croitor

Merge branch 'jp-remove-chat-flag' into 'master'

@@ -1,8 +0,0 @@
----
-name: ai_chat_history_context
-introduced_by_url: https://gitlab.com/gitlab-org/gitlab/-/merge_requests/122920
-rollout_issue_url: https://gitlab.com/gitlab-org/gitlab/-/issues/414606
-milestone: '16.1'
-type: development
-group: group::ai-enablement
-default_enabled: true
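
Because the flag was default_enabled: true and of type development, deleting its definition makes the chat-history behavior unconditional rather than removing it. It also forces the rest of this commit: as far as I understand GitLab's flag validation, development and test environments reject checks against a flag that has no YAML definition, so every remaining Feature.enabled? and push_frontend_feature_flag call for ai_chat_history_context has to go in the same change. A sketch of the failure mode (the exact error class is an assumption):

    # Hypothetical leftover call after this commit:
    Feature.enabled?(:ai_chat_history_context, user)
    # Expected to raise in development/test (assumed: Feature::InvalidFeatureFlagError),
    # since the ai_chat_history_context definition YAML no longer exists.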
@@ -27,8 +27,6 @@ def add_gon_variables
       gon.payment_form_url = ::Gitlab::Routing.url_helpers.subscription_portal_payment_form_url
       gon.payment_validation_form_id = ::Gitlab::SubscriptionPortal::PAYMENT_VALIDATION_FORM_ID
     end
-
-    push_frontend_feature_flag(:ai_chat_history_context, current_user)
   end

   # Exposes if a licensed feature is available.
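
For context, push_frontend_feature_flag is the helper that mirrors a backend flag into the frontend via gon, where the JavaScript glFeatures mixin picks it up camelCased (aiChatHistoryContext). The shape below is a sketch assumed from its usage at this call site, not a copy of GitLab's implementation:

    # Sketch only: rough behavior of the helper call being removed above.
    def push_frontend_feature_flag(name, *args)
      enabled = Feature.enabled?(name, *args)
      # Exposed under gon.features; read client-side as this.glFeatures.<camelCasedName>.
      push_to_gon_attributes(:features, name, enabled)
    end

Dropping this call pairs with the Jest spec change further down, where the default glFeatures stub no longer sets aiChatHistoryContext.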
@@ -148,8 +148,6 @@ def last_conversation
     strong_memoize_attr :last_conversation

     def conversation
-      return [] unless Feature.enabled?(:ai_chat_history_context, context.current_user)
-
       # include only messages with successful response and reorder
       # messages so each question is followed by its answer
       by_request = last_conversation
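
The net effect of this hunk is the removal of a flag-guarded early return: conversation no longer short-circuits to an empty chat, so history is always assembled. A sketch of the resulting method, with the unshown remainder of the body left elided rather than guessed:

    def conversation
      # include only messages with successful response and reorder
      # messages so each question is followed by its answer
      by_request = last_conversation
      # ... rest of the method is outside this hunk; it now runs unconditionally
      # and its return value is what reaches the prompt as the chat history.
    end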
@@ -16,7 +16,7 @@ describe('AiGenieChat', () => {
     data = {},
     scopedSlots = {},
     slots = {},
-    glFeatures = { aiChatHistoryContext: true },
+    glFeatures = {},
   } = {}) => {
     jest.spyOn(AiGenieLoader.methods, 'computeTransitionWidth').mockImplementation();
@@ -206,19 +206,6 @@
       agent.prompt
     end

-    context 'when ai_chat_history_context is disabled' do
-      before do
-        stub_feature_flags(ai_chat_history_context: false)
-      end
-
-      it 'includes an empty chat' do
-        expect(Gitlab::Llm::Chain::Agents::ZeroShot::Prompts::Anthropic)
-          .to receive(:prompt).once.with(a_hash_including(conversation: []))
-
-        agent.prompt
-      end
-    end
-
     it 'includes the prompt' do
       expect(Gitlab::Llm::Chain::Agents::ZeroShot::Prompts::Anthropic)
         .to receive(:prompt).once.with(a_hash_including(prompt_version:
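
With the flag definition deleted, stub_feature_flags(ai_chat_history_context: false) would reference an undefined flag, so the disabled-flag context is dropped rather than rewritten. If coverage of the now-unconditional history were still wanted, a spec along these lines could stand in for it (a sketch only; the matcher and wording are assumptions, not part of this commit):

    it 'always includes the chat history in the prompt options' do
      # conversation is expected to be present unconditionally now; matching on the
      # key alone avoids pinning the exact message contents.
      expect(Gitlab::Llm::Chain::Agents::ZeroShot::Prompts::Anthropic)
        .to receive(:prompt).once.with(a_hash_including(:conversation))

      agent.prompt
    end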