@@ -181,10 +181,9 @@ Therefore, a different setup is required from the [SaaS-only AI features](#test-
1. Ensure that the following environment variables are set in the `.env` file:
```shell
AIGW_AUTH__BYPASS_EXTERNAL=true
ANTHROPIC_API_KEY="[REDACTED]" # IMPORTANT: Ensure you use a Corp account. See https://gitlab.com/gitlab-org/gitlab/-/issues/435911#note_1701762954.
AIGW_VERTEX_TEXT_MODEL__PROJECT="[REDACTED]"
```
1. Run `poetry run ai_gateway`.
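To confirm the gateway came up, you can hit it over HTTP. A minimal sketch, assuming the default port `5052` (the same port used later in this guide) and that the service exposes a `/monitoring/healthz` health endpoint; adjust both if your setup differs:

```shell
# Expect an HTTP 200 once the AI Gateway is listening.
# Port 5052 and the /monitoring/healthz path are assumptions; adjust to your setup.
curl -i http://localhost:5052/monitoring/healthz
```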
...
...
@@ -215,7 +214,7 @@ Therefore, a different setup is required from the [SaaS-only AI features](#test-
1. Create a dummy access token via `gdk rails console`, or skip this step and set up GitLab or CustomersDot as an OIDC provider (see the following section):
```ruby
# Creating a dummy token. This works as long as `AIGW_AUTH__BYPASS_EXTERNAL=true` is set in the AI Gateway.
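# A sketch of the token creation itself. Assumption: the exact model name and
# attributes below may differ in your GitLab version, so verify before relying on it.
::CloudConnector::ServiceAccessToken.create!(token: 'dummy-token', expires_at: 1.month.from_now)
```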
This should enable everyone to see locally how a change in an IDE is sent to the main application, transformed into a prompt, and then sent to the respective model.
...
...
@@ -29,21 +29,21 @@ This should enable everyone to see locally any change in an IDE being sent to th
1. Run `bundle exec rails c` to start a Rails console
1. Call `Feature.enable(:code_suggestions_tokens_api)` from the console
1. Run the GDK with ```export CODE_SUGGESTIONS_BASE_URL=http://localhost:5052```
1. [Set up the AI Gateway](https://gitlab.com/gitlab-org/modelops/applied-ml/code-suggestions/ai-assist#how-to-run-the-server-locally)
1. Build the tree-sitter libraries with ```poetry run scripts/build-tree-sitter-lib.py```
1. Add extra `.env` changes for full debugging insight (see the consolidated snippet after this list):
   1. `AIGW_LOGGING__LEVEL=DEBUG`
   1. `AIGW_LOGGING__FORMAT_JSON=false`
   1. `AIGW_LOGGING__TO_FILE=true`
1. Watch the new log file ```modelgateway_debug.log```, for example: ```tail -f modelgateway_debug.log | fblog -a prefix -a suffix -a current_file_name -a suggestion -a language -a input -a parameters -a score -a exception```
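For convenience, the debug settings from the list above can be appended to the AI Gateway `.env` as one snippet (nothing here beyond the values already listed):

```shell
# Extra AI Gateway .env settings for verbose, file-based logging.
AIGW_LOGGING__LEVEL=DEBUG
AIGW_LOGGING__FORMAT_JSON=false
AIGW_LOGGING__TO_FILE=true
```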
### Setup instructions to use staging AI Gateway
When testing interactions with the AI Gateway, you might want to integrate your local GDK
with the deployed staging AI Gateway. To do this:
1. You need a [cloud staging license](../../user/project/repository/code_suggestions/self_managed.md#upgrade-gitlab) that has the Code Suggestions add-on, because add-ons are enabled on staging. Drop a note in the `#s_fulfillment` internal Slack channel to request an add-on to your license. See this [handbook page](https://about.gitlab.com/handbook/developer-onboarding/#working-on-gitlab-ee-developer-licenses) for how to request a license for local development.
1. Set environment variables to point CustomersDot and the AI Gateway to staging:
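A hedged sketch of what these exports might look like. The variable names and staging URLs below are assumptions (only `CODE_SUGGESTIONS_BASE_URL` appears elsewhere in this guide), so verify them against the current documentation before use:

```shell
# Assumptions: names and URLs may have changed; double-check before relying on them.
export GITLAB_LICENSE_MODE=test                                         # validate licenses against staging CustomersDot
export CUSTOMER_PORTAL_URL="https://customers.staging.gitlab.com"       # staging CustomersDot
export CODE_SUGGESTIONS_BASE_URL="https://cloud.staging.gitlab.com/ai"  # staging AI Gateway
```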