This directory contains scripts useful for interacting with Kibana tools in development. Use the node executable and the `--help` flag to learn how they work:

```
node scripts/{{script name}} --help
```

This directory is excluded from the build, and the tools within it should help users discover their capabilities. Each script in this directory must:
- require `src/setup_node_env` to bootstrap the NodeJS environment
- call out to source code in the `src` or `packages` directories
- react to the `--help` flag
- run everywhere OR check and fail fast when a required OS or toolchain is not available
```
node scripts/functional_tests [--config src/platform/test/functional/config.base.js --config test/api_integration/config.js]
```
Runs all the functional tests: Selenium tests and API integration tests. List configs with multiple `--config` arguments. Uses the `@kbn/test` library to start Elasticsearch and Kibana servers and run tests against those servers, for multiple server+test setups. In particular, calls out to `runTests()`. Can also be run with a single config.
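As a sketch of the flag-repetition pattern (the config paths are the ones from the example above; the loop itself is only illustrative, not part of any script):

```shell
# Sketch: each config gets its own --config flag when invoking functional_tests.
CONFIGS="src/platform/test/functional/config.base.js test/api_integration/config.js"
ARGS=""
for c in $CONFIGS; do
  ARGS="$ARGS --config $c"
done
echo "node scripts/functional_tests$ARGS"
# prints: node scripts/functional_tests --config src/platform/test/functional/config.base.js --config test/api_integration/config.js
```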
```
node scripts/functional_tests_server [--config src/platform/test/functional/config.base.js]
```
Starts just the Elasticsearch and Kibana servers for a single config, e.g. via `--config src/platform/test/functional/config.base.js` or `--config test/api_integration/config.js`. This lets you start just the servers with this script and keep them running while you run tests against them. The idea is that the same config file configures both the Elasticsearch and Kibana servers. Uses the `startServers()` method from the `@kbn/test` library.
Example. Start servers and run tests separately, but using the same config:

```
# Just the servers
node scripts/functional_tests_server --config path/to/config
```

In another terminal:

```
# Just the tests, against the running servers
node scripts/functional_test_runner --config path/to/config
```

For details on how the internal methods work, read this readme.
If you wish to load up specific es archived data for your test, you can do so via:
```
node scripts/es_archiver.js load <archive> [--es-url=http://username:password@localhost:9200] [--kibana-url=http://username:password@localhost:5601/{basepath?}]
```
That will load the specified archive from the archive directory specified by the default functional config file, located at `src/platform/test/functional/config.base.js`. To load archives from other functional config files, pass `--config path/to/config.js`.
Note: the `--es-url` and `--kibana-url` options may or may not be necessary depending on your current Kibana configuration settings, and their values may also change based on those settings (for example, if you are not running with security you will not need the `username:password` portion).
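As a sketch of that note (the variable names below are illustrative only and are not read by `es_archiver`; the `elastic`/`changeme` credentials are the development defaults mentioned later in this readme):

```shell
# Illustrative only: build an --es-url value with or without credentials,
# depending on whether security is enabled. ES_USER/ES_PASS/SECURITY_ENABLED
# are example names for this sketch, not consumed by es_archiver.
ES_USER="elastic"
ES_PASS="changeme"
SECURITY_ENABLED=true
if [ "$SECURITY_ENABLED" = true ]; then
  ES_URL="http://${ES_USER}:${ES_PASS}@localhost:9200"
else
  ES_URL="http://localhost:9200"
fi
echo "$ES_URL"
# prints: http://elastic:changeme@localhost:9200
```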
You can save existing data into an archive by using the save command:
```
node scripts/es_archiver.js save <archive name for kibana data> [space separated list of index patterns to include]
```
You may want to store the `.kibana` index separately from data. Since adding a lot of data will bloat our repo size, we have many tests that reuse the same data indices but use their own `.kibana` index.
```
node scripts/sync_logs.js [options]
```
Continuously syncs documents from a source Elasticsearch cluster to a destination cluster. Each cycle fetches documents within a 24-hour `@timestamp` window from the source, optionally rewrites the target index, and bulk-pushes them to the destination. The source is configured via environment variables or CLI flags; the destination uses the same cluster as Kibana (read from `config/kibana.dev.yml` or `--config`, with the env overrides `ELASTICSEARCH_HOST`, `ELASTICSEARCH_USERNAME`, and `ELASTICSEARCH_PASSWORD`), like the `otel_demo` script. Configuration is read from `process.env` (set in the shell or passed inline); CLI flags override env, as in other scripts in this directory (e.g. `workflows_import_export.js`).
Example with env vars inline:

```
SOURCE_ELASTICSEARCH_HOST=https://source:9200 SOURCE_ELASTICSEARCH_API_KEY=... node scripts/sync_logs.js
```

Example with env vars in the shell (set once per session):

```
export SOURCE_ELASTICSEARCH_HOST=https://source:9200
export SOURCE_ELASTICSEARCH_API_KEY=...
node scripts/sync_logs.js
```

Example with CLI flags:

```
node scripts/sync_logs.js --source-host=https://source:9200 --source-api-key=... --index-pattern="logs*" --size=100 --interval=5
```

Required: `SOURCE_ELASTICSEARCH_HOST` and `SOURCE_ELASTICSEARCH_API_KEY` (or `--source-host` and `--source-api-key`). The destination is read from the Kibana config (default `config/kibana.dev.yml`) or env; it defaults to `http://localhost:9200` with basic auth `elastic`/`changeme`. Use `node scripts/sync_logs.js --help` for all options.
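The destination fallback described above can be sketched in plain shell (a sketch only — the actual script resolves this in Node; `https://dest:9200` is a hypothetical host):

```shell
# Sketch of the precedence rule: the ELASTICSEARCH_HOST env var overrides the default destination.
unset ELASTICSEARCH_HOST              # start from a clean environment for this sketch
DEST_HOST="${ELASTICSEARCH_HOST:-http://localhost:9200}"
echo "$DEST_HOST"                     # default wins when the env var is unset

ELASTICSEARCH_HOST=https://dest:9200    # hypothetical destination host
DEST_HOST="${ELASTICSEARCH_HOST:-http://localhost:9200}"
echo "$DEST_HOST"                     # env value wins once set
```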