---
stage: Monitor
group: Respond
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/product/ux/technical-writing/#assignments
---

Log system (FREE SELF)

GitLab has an advanced log system where everything is logged, so you can analyze your instance using various system log files. The log system is similar to audit events.

System log files are typically plain text in a standard log file format. This guide talks about how to read and use these system log files.

The following sections describe the log system and how to read and use the logs.

Log Levels

Each log message has an assigned log level that indicates its importance and verbosity. Each logger has an assigned minimum log level. A logger emits a log message only if its log level is equal to or above the minimum log level.

The following log levels are supported:

| Level | Name    |
|-------|---------|
| 0     | DEBUG   |
| 1     | INFO    |
| 2     | WARN    |
| 3     | ERROR   |
| 4     | FATAL   |
| 5     | UNKNOWN |

GitLab loggers emit all log messages because they are set to DEBUG by default.
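The gating rule above can be sketched in a few lines. This is a minimal illustration of the numeric levels in the table, not GitLab code:

```python
# Numeric log levels from the table above.
LEVELS = {"DEBUG": 0, "INFO": 1, "WARN": 2, "ERROR": 3, "FATAL": 4, "UNKNOWN": 5}

def should_emit(message_level: str, minimum_level: str) -> bool:
    """A logger emits a message only if its level is at or above the minimum."""
    return LEVELS[message_level] >= LEVELS[minimum_level]

print(should_emit("DEBUG", "INFO"))  # suppressed: DEBUG is below INFO
print(should_emit("ERROR", "INFO"))  # emitted: ERROR is above INFO
```

With the default minimum of DEBUG, every message passes this check, which is why GitLab loggers emit everything by default.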

Override default log level

You can override the minimum log level for GitLab loggers using the GITLAB_LOG_LEVEL environment variable. Valid values are a number from 0 to 5, or the name of a log level.

Example:

GITLAB_LOG_LEVEL=info

For some services, other log levels are in place that are not affected by this setting. Some of these services have their own environment variables to override the log level. For example:

| Service              | Log level | Environment variable |
|----------------------|-----------|----------------------|
| GitLab API           | INFO      |                      |
| GitLab Cleanup       | INFO      | DEBUG                |
| GitLab Doctor        | INFO      | VERBOSE              |
| GitLab Export        | INFO      | EXPORT_DEBUG         |
| GitLab Geo           | INFO      |                      |
| GitLab Import        | INFO      | IMPORT_DEBUG         |
| GitLab QA Runtime    | INFO      | QA_LOG_LEVEL         |
| Google APIs          | INFO      |                      |
| Rack Timeout         | ERROR     |                      |
| Sidekiq (server)     | INFO      |                      |
| Snowplow Tracker     | FATAL     |                      |
| gRPC Client (Gitaly) | WARN      | GRPC_LOG_LEVEL       |

Log Rotation

The logs for a given service may be managed and rotated by:

  • logrotate
  • svlogd (runit's service logging daemon)
  • logrotate and svlogd
  • Or not at all

The following table includes information about what's responsible for managing and rotating logs for the included services. Logs managed by svlogd are written to a file called current. The logrotate service built into GitLab manages all logs except those captured by runit.

| Log type                    | Managed by logrotate | Managed by svlogd/runit |
|-----------------------------|----------------------|-------------------------|
| Alertmanager logs           | {dotted-circle} No   | {check-circle} Yes      |
| crond logs                  | {dotted-circle} No   | {check-circle} Yes      |
| Gitaly                      | {check-circle} Yes   | {check-circle} Yes      |
| GitLab Exporter for Omnibus | {dotted-circle} No   | {check-circle} Yes      |
| GitLab Pages logs           | {check-circle} Yes   | {check-circle} Yes      |
| GitLab Rails                | {check-circle} Yes   | {dotted-circle} No      |
| GitLab Shell logs           | {check-circle} Yes   | {dotted-circle} No      |
| Grafana logs                | {dotted-circle} No   | {check-circle} Yes      |
| LogRotate logs              | {dotted-circle} No   | {check-circle} Yes      |
| Mailroom                    | {check-circle} Yes   | {check-circle} Yes      |
| NGINX                       | {check-circle} Yes   | {check-circle} Yes      |
| PostgreSQL logs             | {dotted-circle} No   | {check-circle} Yes      |
| Praefect logs               | {check-circle} Yes   | {check-circle} Yes      |
| Prometheus logs             | {dotted-circle} No   | {check-circle} Yes      |
| Puma                        | {check-circle} Yes   | {check-circle} Yes      |
| Redis logs                  | {dotted-circle} No   | {check-circle} Yes      |
| Registry logs               | {dotted-circle} No   | {check-circle} Yes      |
| Workhorse logs              | {check-circle} Yes   | {check-circle} Yes      |

production_json.log

Depending on your installation method, this file is located at:

  • Omnibus GitLab: /var/log/gitlab/gitlab-rails/production_json.log
  • Installations from source: /home/git/gitlab/log/production_json.log

It contains a structured log for Rails controller requests received from GitLab, thanks to Lograge. Requests from the API are logged to a separate file in api_json.log.

Each line contains JSON that can be ingested by services like Elasticsearch and Splunk. Line breaks were added to examples for legibility:

{
  "method":"GET",
  "path":"/gitlab/gitlab-foss/issues/1234",
  "format":"html",
  "controller":"Projects::IssuesController",
  "action":"show",
  "status":200,
  "time":"2017-08-08T20:15:54.821Z",
  "params":[{"key":"param_key","value":"param_value"}],
  "remote_ip":"18.245.0.1",
  "user_id":1,
  "username":"admin",
  "queue_duration_s":0.0,
  "gitaly_calls":16,
  "gitaly_duration_s":0.16,
  "redis_calls":115,
  "redis_duration_s":0.13,
  "redis_read_bytes":1507378,
  "redis_write_bytes":2920,
  "correlation_id":"O1SdybnnIq7",
  "cpu_s":17.50,
  "db_duration_s":0.08,
  "view_duration_s":2.39,
  "duration_s":20.54,
  "pid": 81836,
  "worker_id":"puma_0"
}

This example was a GET request for a specific issue. Each line also contains performance data, with times in seconds:

  • duration_s: Total time to retrieve the request
  • queue_duration_s: Total time the request was queued inside GitLab Workhorse
  • view_duration_s: Total time inside the Rails views
  • db_duration_s: Total time to retrieve data from PostgreSQL
  • cpu_s: Total time spent on CPU
  • gitaly_duration_s: Total time taken by Gitaly calls
  • gitaly_calls: Total number of calls made to Gitaly
  • redis_calls: Total number of calls made to Redis
  • redis_cross_slot_calls: Total number of cross-slot calls made to Redis
  • redis_allowed_cross_slot_calls: Total number of allowed cross-slot calls made to Redis
  • redis_duration_s: Total time to retrieve data from Redis
  • redis_read_bytes: Total bytes read from Redis
  • redis_write_bytes: Total bytes written to Redis
  • redis_<instance>_calls: Total number of calls made to a Redis instance
  • redis_<instance>_cross_slot_calls: Total number of cross-slot calls made to a Redis instance
  • redis_<instance>_allowed_cross_slot_calls: Total number of allowed cross-slot calls made to a Redis instance
  • redis_<instance>_duration_s: Total time to retrieve data from a Redis instance
  • redis_<instance>_read_bytes: Total bytes read from a Redis instance
  • redis_<instance>_write_bytes: Total bytes written to a Redis instance
  • pid: The worker's Linux process ID (changes when workers restart)
  • worker_id: The worker's logical ID (does not change when workers restart)
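Because each line is a self-contained JSON object, these fields are straightforward to consume programmatically. The helper below is a hypothetical sketch (not part of GitLab): it parses one production_json.log line and summarizes where the time went, using the field names listed above; the 10-second slow threshold is an arbitrary choice for illustration.

```python
import json

def summarize_request(line: str, slow_threshold_s: float = 10.0) -> dict:
    """Parse one production_json.log line and summarize its timings."""
    entry = json.loads(line)
    return {
        "action": f'{entry["controller"]}#{entry["action"]}',
        "total_s": entry["duration_s"],
        "db_s": entry.get("db_duration_s", 0.0),
        "gitaly_s": entry.get("gitaly_duration_s", 0.0),
        "slow": entry["duration_s"] > slow_threshold_s,
    }

# Abbreviated sample line based on the example entry above.
sample = ('{"controller":"Projects::IssuesController","action":"show",'
          '"duration_s":20.54,"db_duration_s":0.08,"gitaly_duration_s":0.16}')
print(summarize_request(sample))
```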

User clone and fetch activity using HTTP transport appears in the log as action: git_upload_pack.

In addition, the log contains the originating IP address (remote_ip), the user's ID (user_id), and username (username).

Some endpoints (such as /search) may make requests to Elasticsearch if using advanced search. These additionally log elasticsearch_calls and elasticsearch_duration_s, which correspond to:

  • elasticsearch_calls: Total number of calls to Elasticsearch
  • elasticsearch_duration_s: Total time taken by Elasticsearch calls
  • elasticsearch_timed_out_count: Total number of calls to Elasticsearch that timed out and therefore returned partial results

ActionCable connection and subscription events are also logged to this file and they follow the previous format. The method, path, and format fields are not applicable, and are always empty. The ActionCable connection or channel class is used as the controller.

{
  "method":null,
  "path":null,
  "format":null,
  "controller":"IssuesChannel",
  "action":"subscribe",
  "status":200,
  "time":"2020-05-14T19:46:22.008Z",
  "params":[{"key":"project_path","value":"gitlab/gitlab-foss"},{"key":"iid","value":"1"}],
  "remote_ip":"127.0.0.1",
  "user_id":1,
  "username":"admin",
  "ua":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:76.0) Gecko/20100101 Firefox/76.0",
  "correlation_id":"jSOIEynHCUa",
  "duration_s":0.32566
}

NOTE: Starting with GitLab 12.5, if an error occurs, an exception field is included with class, message, and backtrace. Previous versions included an error field instead of exception.class and exception.message. For example:

{
  "method": "GET",
  "path": "/admin",
  "format": "html",
  "controller": "Admin::DashboardController",
  "action": "index",
  "status": 500,
  "time": "2019-11-14T13:12:46.156Z",
  "params": [],
  "remote_ip": "127.0.0.1",
  "user_id": 1,
  "username": "root",
  "ua": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.14; rv:70.0) Gecko/20100101 Firefox/70.0",
  "queue_duration": 274.35,
  "correlation_id": "KjDVUhNvvV3",
  "queue_duration_s":0.0,
  "gitaly_calls":16,
  "gitaly_duration_s":0.16,
  "redis_calls":115,
  "redis_duration_s":0.13,
  "correlation_id":"O1SdybnnIq7",
  "cpu_s":17.50,
  "db_duration_s":0.08,
  "view_duration_s":2.39,
  "duration_s":20.54,
  "pid": 81836,
  "worker_id": "puma_0",
  "exception.class": "NameError",
  "exception.message": "undefined local variable or method `adsf' for #<Admin::DashboardController:0x00007ff3c9648588>",
  "exception.backtrace": [
    "app/controllers/admin/dashboard_controller.rb:11:in `index'",
    "ee/app/controllers/ee/admin/dashboard_controller.rb:14:in `index'",
    "ee/lib/gitlab/ip_address_state.rb:10:in `with'",
    "ee/app/controllers/ee/application_controller.rb:43:in `set_current_ip_address'",
    "lib/gitlab/session.rb:11:in `with_session'",
    "app/controllers/application_controller.rb:450:in `set_session_storage'",
    "app/controllers/application_controller.rb:444:in `set_locale'",
    "ee/lib/gitlab/jira/middleware.rb:19:in `call'"
  ]
}

production.log

Depending on your installation method, this file is located at:

  • Omnibus GitLab: /var/log/gitlab/gitlab-rails/production.log
  • Installations from source: /home/git/gitlab/log/production.log

It contains information about all performed requests. You can see the URL and type of request, the IP address, and which parts of code were involved in serving this particular request. You can also see all SQL queries performed and how long each took. This log is more useful for GitLab contributors and developers. Include the relevant part of this log file when you're reporting bugs. For example:

Started GET "/gitlabhq/yaml_db/tree/master" for 168.111.56.1 at 2015-02-12 19:34:53 +0200
Processing by Projects::TreeController#show as HTML
  Parameters: {"project_id"=>"gitlabhq/yaml_db", "id"=>"master"}

  ... [CUT OUT]

  Namespaces"."created_at" DESC, "namespaces"."id" DESC LIMIT 1 [["id", 26]]
  CACHE (0.0ms) SELECT  "members".* FROM "members"  WHERE "members"."source_type" = 'Project' AND "members"."type" IN ('ProjectMember') AND "members"."source_id" = $1 AND "members"."source_type" = $2 AND "members"."user_id" = 1  ORDER BY "members"."created_at" DESC, "members"."id" DESC LIMIT 1  [["source_id", 18], ["source_type", "Project"]]
  CACHE (0.0ms) SELECT  "members".* FROM "members"  WHERE "members"."source_type" = 'Project' AND "members".
  (1.4ms) SELECT COUNT(*) FROM "merge_requests"  WHERE "merge_requests"."target_project_id" = $1 AND ("merge_requests"."state" IN ('opened','reopened')) [["target_project_id", 18]]
  Rendered layouts/nav/_project.html.haml (28.0ms)
  Rendered layouts/_collapse_button.html.haml (0.2ms)
  Rendered layouts/_flash.html.haml (0.1ms)
  Rendered layouts/_page.html.haml (32.9ms)
Completed 200 OK in 166ms (Views: 117.4ms | ActiveRecord: 27.2ms)

In this example, the server processed an HTTP request with URL /gitlabhq/yaml_db/tree/master from IP 168.111.56.1 at 2015-02-12 19:34:53 +0200. The request was processed by Projects::TreeController.
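Unlike production_json.log, this file is plain text, so extracting timings takes a little parsing. The snippet below is a sketch (not part of GitLab) that pulls the status code and timings out of the final "Completed" line of a request:

```python
import re

# Matches lines like:
#   Completed 200 OK in 166ms (Views: 117.4ms | ActiveRecord: 27.2ms)
COMPLETED = re.compile(
    r"Completed (?P<status>\d{3}) .* in (?P<total>\d+)ms "
    r"\(Views: (?P<views>[\d.]+)ms \| ActiveRecord: (?P<db>[\d.]+)ms\)"
)

line = "Completed 200 OK in 166ms (Views: 117.4ms | ActiveRecord: 27.2ms)"
m = COMPLETED.search(line)
timings = {k: float(v) for k, v in m.groupdict().items()}
print(timings)  # {'status': 200.0, 'total': 166.0, 'views': 117.4, 'db': 27.2}
```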

api_json.log

Depending on your installation method, this file is located at:

  • Omnibus GitLab: /var/log/gitlab/gitlab-rails/api_json.log
  • Installations from source: /home/git/gitlab/log/api_json.log

It helps you see requests made directly to the API. For example:

{
  "time":"2018-10-29T12:49:42.123Z",
  "severity":"INFO",
  "duration":709.08,
  "db":14.59,
  "view":694.49,
  "status":200,
  "method":"GET",
  "path":"/api/v4/projects",
  "params":[{"key":"action","value":"git-upload-pack"},{"key":"changes","value":"_any"},{"key":"key_id","value":"secret"},{"key":"secret_token","value":"[FILTERED]"}],
  "host":"localhost",
  "remote_ip":"::1",
  "ua":"Ruby",
  "route":"/api/:version/projects",
  "user_id":1,
  "username":"root",
  "queue_duration":100.31,
  "gitaly_calls":30,
  "gitaly_duration":5.36,
  "pid": 81836,
  "worker_id": "puma_0",
  ...
}

This entry shows an internal endpoint accessed to check whether an associated SSH key can download the project in question by using a git fetch or git clone. In this example, we see:

  • duration: Total time in milliseconds to retrieve the request
  • queue_duration: Total time in milliseconds the request was queued inside GitLab Workhorse
  • method: The HTTP method used to make the request
  • path: The relative path of the query
  • params: Key-value pairs passed in a query string or HTTP body (sensitive parameters, such as passwords and tokens, are filtered out)
  • ua: The User-Agent of the requester

NOTE: As of Grape Logging v1.8.4, the view duration is calculated as duration minus db. Therefore, the view duration can be affected by several other factors, such as Redis read/write operations or external HTTP calls, not only the serialization process.
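Using the example entry above, that derivation can be checked directly (note that api_json.log reports these durations in milliseconds):

```python
# Timings from the example api_json.log entry above, in milliseconds.
entry = {"duration": 709.08, "db": 14.59, "view": 694.49}

# Grape Logging derives the view time as total duration minus database time.
derived_view = round(entry["duration"] - entry["db"], 2)
assert derived_view == entry["view"]  # 709.08 - 14.59 == 694.49
print(derived_view)
```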

application.log

Depending on your installation method, this file is located at:

  • Omnibus GitLab: /var/log/gitlab/gitlab-rails/application.log
  • Installations from source: /home/git/gitlab/log/application.log

It helps you discover events happening in your instance such as user creation and project deletion. For example:

October 06, 2014 11:56: User "Administrator" (admin@example.com) was created
October 06, 2014 11:56: Documentcloud created a new project "Documentcloud / Underscore"
October 06, 2014 11:56: Gitlab Org created a new project "Gitlab Org / Gitlab Ce"
October 07, 2014 11:25: User "Claudie Hodkiewicz" (nasir_stehr@olson.co.uk)  was removed
October 07, 2014 11:25: Project "project133" was removed

application_json.log

Introduced in GitLab 12.7.

Depending on your installation method, this file is located at:

  • Omnibus GitLab: /var/log/gitlab/gitlab-rails/application_json.log
  • Installations from source: /home/git/gitlab/log/application_json.log

It contains the JSON version of the logs in application.log, like this example:

{
  "severity":"INFO",
  "time":"2020-01-14T13:35:15.466Z",
  "correlation_id":"3823a1550b64417f9c9ed8ee0f48087e",
  "message":"User \"Administrator\" (admin@example.com) was created"
}
{
  "severity":"INFO",
  "time":"2020-01-14T13:35:15.466Z",
  "correlation_id":"78e3df10c9a18745243d524540bd5be4",
  "message":"Project \"project133\" was removed"
}

integrations_json.log

Depending on your installation method, this file is located at:

  • Omnibus GitLab: /var/log/gitlab/gitlab-rails/integrations_json.log
  • Installations from source: /home/git/gitlab/log/integrations_json.log

It contains information about integration activities, such as Jira, Asana, and irker services. It uses JSON format, like this example:

{
  "severity":"ERROR",
  "time":"2018-09-06T14:56:20.439Z",
  "service_class":"Integrations::Jira",
  "project_id":8,
  "project_path":"h5bp/html5-boilerplate",
  "message":"Error sending message",
  "client_url":"http://jira.gitlap.com:8080",
  "error":"execution expired"
}
{
  "severity":"INFO",
  "time":"2018-09-06T17:15:16.365Z",
  "service_class":"Integrations::Jira",
  "project_id":3,
  "project_path":"namespace2/project2",
  "message":"Successfully posted",
  "client_url":"http://jira.example.com"
}

kubernetes.log (deprecated)

Deprecated in GitLab 14.5.

Depending on your installation method, this file is located at:

  • Omnibus GitLab: /var/log/gitlab/gitlab-rails/kubernetes.log
  • Installations from source: /home/git/gitlab/log/kubernetes.log

It logs information related to certificate-based clusters, such as connectivity errors. Each line contains JSON that can be ingested by services like Elasticsearch and Splunk.

git_json.log

Depending on your installation method, this file is located at:

  • Omnibus GitLab: /var/log/gitlab/gitlab-rails/git_json.log
  • Installations from source: /home/git/gitlab/log/git_json.log

After GitLab version 12.2, this file was renamed from githost.log to git_json.log and stored in JSON format.

GitLab has to interact with Git repositories, but in some rare cases something can go wrong. If this happens, you need to know exactly what happened. This log file contains all failed requests from GitLab to Git repositories. In the majority of cases this file is useful for developers only. For example:

{
   "severity":"ERROR",
   "time":"2019-07-19T22:16:12.528Z",
   "correlation_id":"FeGxww5Hj64",
   "message":"Command failed [1]: /usr/bin/git --git-dir=/Users/vsizov/gitlab-development-kit/gitlab/tmp/tests/gitlab-satellites/group184/gitlabhq/.git --work-tree=/Users/vsizov/gitlab-development-kit/gitlab/tmp/tests/gitlab-satellites/group184/gitlabhq merge --no-ff -mMerge branch 'feature_conflict' into 'feature' source/feature_conflict\n\nerror: failed to push some refs to '/Users/vsizov/gitlab-development-kit/repositories/gitlabhq/gitlab_git.git'"
}

audit_json.log (FREE)

NOTE: GitLab Free tracks a small number of different audit events. GitLab Premium tracks many more.

Depending on your installation method, this file is located at:

  • Omnibus GitLab: /var/log/gitlab/gitlab-rails/audit_json.log
  • Installations from source: /home/git/gitlab/log/audit_json.log

Changes to group or project settings and memberships (target_details) are logged to this file. For example:

{
  "severity":"INFO",
  "time":"2018-10-17T17:38:22.523Z",
  "author_id":3,
  "entity_id":2,
  "entity_type":"Project",
  "change":"visibility",
  "from":"Private",
  "to":"Public",
  "author_name":"John Doe4",
  "target_id":2,
  "target_type":"Project",
  "target_details":"namespace2/project2"
}

Sidekiq logs

NOTE: In Omnibus GitLab 12.10 or earlier, the Sidekiq log is at /var/log/gitlab/gitlab-rails/sidekiq.log.

For Omnibus GitLab installations, some Sidekiq logs are in /var/log/gitlab/sidekiq/current, as described below.

sidekiq.log

Depending on your installation method, this file is located at:

  • Omnibus GitLab: /var/log/gitlab/sidekiq/current
  • Installations from source: /home/git/gitlab/log/sidekiq.log

GitLab uses background jobs for processing tasks that can take a long time. All information about processing these jobs is written to this file. For example:

2014-06-10T07:55:20Z 2037 TID-tm504 ERROR: /opt/bitnami/apps/discourse/htdocs/vendor/bundle/ruby/1.9.1/gems/redis-3.0.7/lib/redis/client.rb:228:in `read'
2014-06-10T18:18:26Z 14299 TID-55uqo INFO: Booting Sidekiq 3.0.0 with redis options {:url=>"redis://localhost:6379/0", :namespace=>"sidekiq"}

Instead of the previous format, you can opt to generate JSON logs for Sidekiq. For example:

{
  "severity":"INFO",
  "time":"2018-04-03T22:57:22.071Z",
  "queue":"cronjob:update_all_mirrors",
  "args":[],
  "class":"UpdateAllMirrorsWorker",
  "retry":false,
  "queue_namespace":"cronjob",
  "jid":"06aeaa3b0aadacf9981f368e",
  "created_at":"2018-04-03T22:57:21.930Z",
  "enqueued_at":"2018-04-03T22:57:21.931Z",
  "pid":10077,
  "worker_id":"sidekiq_0",
  "message":"UpdateAllMirrorsWorker JID-06aeaa3b0aadacf9981f368e: done: 0.139 sec",
  "job_status":"done",
  "duration":0.139,
  "completed_at":"2018-04-03T22:57:22.071Z",
  "db_duration":0.05,
  "db_duration_s":0.0005,
  "gitaly_duration":0,
  "gitaly_calls":0
}

For Omnibus GitLab installations, add the configuration option:

sidekiq['log_format'] = 'json'

For installations from source, edit the gitlab.yml and set the Sidekiq log_format configuration option:

  ## Sidekiq
  sidekiq:
    log_format: json
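Once Sidekiq logs as JSON, per-worker timings can be aggregated from the class, job_status, and duration fields shown in the example. A hypothetical sketch over sample lines:

```python
import json
from collections import defaultdict

# Sample JSON-format Sidekiq log lines (abbreviated).
lines = [
    '{"class":"UpdateAllMirrorsWorker","job_status":"done","duration":0.139}',
    '{"class":"UpdateAllMirrorsWorker","job_status":"done","duration":0.261}',
]

# Sum completed-job durations per worker class.
total_by_class = defaultdict(float)
for line in lines:
    job = json.loads(line)
    if job.get("job_status") == "done":
        total_by_class[job["class"]] += job["duration"]

print(dict(total_by_class))
```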

sidekiq_client.log

Introduced in GitLab 12.9.

Depending on your installation method, this file is located at:

  • Omnibus GitLab: /var/log/gitlab/gitlab-rails/sidekiq_client.log
  • Installations from source: /home/git/gitlab/log/sidekiq_client.log

This file contains logging information about jobs before Sidekiq starts processing them, for example, before they are enqueued.

This log file follows the same structure as sidekiq.log, so it is structured as JSON if you've configured this for Sidekiq as mentioned above.

gitlab-shell.log

GitLab Shell is used by GitLab for executing Git commands and providing SSH access to Git repositories.

For GitLab versions 12.10 and up

Information containing git-{upload-pack,receive-pack} requests is at /var/log/gitlab/gitlab-shell/gitlab-shell.log. Information about hooks to GitLab Shell from Gitaly is at /var/log/gitlab/gitaly/current.

Example log entries for /var/log/gitlab/gitlab-shell/gitlab-shell.log:

{
  "duration_ms": 74.104,
  "level": "info",
  "method": "POST",
  "msg": "Finished HTTP request",
  "time": "2020-04-17T20:28:46Z",
  "url": "http://127.0.0.1:8080/api/v4/internal/allowed"
}
{
  "command": "git-upload-pack",
  "git_protocol": "",
  "gl_project_path": "root/example",
  "gl_repository": "project-1",
  "level": "info",
  "msg": "executing git command",
  "time": "2020-04-17T20:28:46Z",
  "user_id": "user-1",
  "username": "root"
}

Example log entries for /var/log/gitlab/gitaly/current:

{
  "method": "POST",
  "url": "http://127.0.0.1:8080/api/v4/internal/allowed",
  "duration": 0.058012959,
  "gitaly_embedded": true,
  "pid": 16636,
  "level": "info",
  "msg": "finished HTTP request",
  "time": "2020-04-17T20:29:08+00:00"
}
{
  "method": "POST",
  "url": "http://127.0.0.1:8080/api/v4/internal/pre_receive",
  "duration": 0.031022552,
  "gitaly_embedded": true,
  "pid": 16636,
  "level": "info",
  "msg": "finished HTTP request",
  "time": "2020-04-17T20:29:08+00:00"
}

For GitLab versions 12.5 through 12.9

For GitLab 12.5 to 12.9, depending on your installation method, this file is located at:

  • Omnibus GitLab: /var/log/gitlab/gitaly/gitlab-shell.log
  • Installation from source: /home/git/gitaly/gitlab-shell.log

Example log entries:

{
  "method": "POST",
  "url": "http://127.0.0.1:8080/api/v4/internal/post_receive",
  "duration": 0.031809164,
  "gitaly_embedded": true,
  "pid": 27056,
  "level": "info",
  "msg": "finished HTTP request",
  "time": "2020-04-17T16:24:38+00:00"
}

For GitLab 12.5 and earlier

For GitLab 12.5 and earlier, the file is at /var/log/gitlab/gitlab-shell/gitlab-shell.log.

Example log entries:

I, [2015-02-13T06:17:00.671315 #9291]  INFO -- : Adding project root/example.git at </var/opt/gitlab/git-data/repositories/root/dcdcdcdcd.git>.
I, [2015-02-13T06:17:00.679433 #9291]  INFO -- : Moving existing hooks directory and symlinking global hooks directory for /var/opt/gitlab/git-data/repositories/root/example.git.

User clone/fetch activity using SSH transport appears in this log as executing git command <gitaly-upload-pack....

Gitaly logs

This file is in /var/log/gitlab/gitaly/current and is produced by runit. runit is packaged with Omnibus GitLab and a brief explanation of its purpose is available in the Omnibus GitLab documentation. Log files are rotated, renamed in Unix timestamp format, and gzip-compressed (like @1584057562.s).
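Because the rotated files are gzip-compressed, they can be read without decompressing them to disk first. A sketch under that assumption; the in-memory buffer below stands in for a real rotated file path:

```python
import gzip
import io

def read_rotated_log(path_or_file) -> list:
    """Return the lines of a gzip-compressed rotated log."""
    with gzip.open(path_or_file, mode="rt") as fh:
        return fh.read().splitlines()

# Simulate a rotated file in memory instead of touching disk.
buf = io.BytesIO(gzip.compress(b"line one\nline two\n"))
print(read_rotated_log(buf))  # ['line one', 'line two']
```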

grpc.log

This file is at /var/log/gitlab/gitlab-rails/grpc.log for Omnibus GitLab packages. It contains the native gRPC logging used by Gitaly.

gitaly_hooks.log

This file is at /var/log/gitlab/gitaly/gitaly_hooks.log and is produced by the gitaly-hooks command. It also contains records of failures encountered while processing responses from the GitLab API.

Puma logs

puma_stdout.log

Depending on your installation method, this file is located at:

  • Omnibus GitLab: /var/log/gitlab/puma/puma_stdout.log
  • Installations from source: /home/git/gitlab/log/puma_stdout.log

puma_stderr.log

Depending on your installation method, this file is located at:

  • Omnibus GitLab: /var/log/gitlab/puma/puma_stderr.log
  • Installations from source: /home/git/gitlab/log/puma_stderr.log

repocheck.log

Depending on your installation method, this file is located at:

  • Omnibus GitLab: /var/log/gitlab/gitlab-rails/repocheck.log
  • Installations from source: /home/git/gitlab/log/repocheck.log

It logs information whenever a repository check is run on a project.

importer.log

Depending on your installation method, this file is located at:

  • Omnibus GitLab: /var/log/gitlab/gitlab-rails/importer.log
  • Installations from source: /home/git/gitlab/log/importer.log

This file logs the progress of project imports and migrations.

exporter.log

Introduced in GitLab 13.1.

Depending on your installation method, this file is located at:

  • Omnibus GitLab: /var/log/gitlab/gitlab-rails/exporter.log
  • Installations from source: /home/git/gitlab/log/exporter.log

It logs the progress of the export process.

features_json.log