azimutt v0.1.32
Export database schema from relational or document databases. Import it to https://azimutt.app
Azimutt CLI eases your work with databases 😎
It works with: PostgreSQL, MySQL, MariaDB, SQL Server, Oracle, MongoDB, Couchbase, Snowflake, BigQuery (can be extended on demand).
It's a toolbox to interact with these databases, but also with AML. Here are the main features:
- explore: explore your database schema and data
- analyze: analyze your database and make recommendations
- gateway: launch a Node.js server allowing Azimutt to connect to your databases
- convert: convert database schema dialects from one to another (AML, SQL, Mermaid, Markdown...)
To use it, you need npm: install it globally (`npm install -g azimutt`) or launch it directly (`npx azimutt@latest <command> <args>`).
CLI Commands
Explore
This one is just a shortcut to start the gateway (like the gateway command) and open Azimutt with your URL to explore your database, easy-peasy!
npx azimutt@latest explore <db_url>
Options:
- `--instance <instance>`: select the Azimutt instance to open (by default it will be https://azimutt.app)
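As a sketch, to explore a local PostgreSQL database through a self-hosted Azimutt instance (both URLs below are placeholders, adjust them to your setup):

```shell
# Hypothetical example: the database URL and the instance URL are placeholders
npx azimutt@latest explore "postgresql://postgres:postgres@localhost:5432/azimutt_dev" \
  --instance https://azimutt.example.com
```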
Analyze
Connect to your database, extract the schema, statistics and queries to run some analyses and recommend improvement actions.
npx azimutt@latest analyze <db_url>
You can see this as a database linter. The first time, it writes a config file (by default in `~/.azimutt/analyze/$db_name/conf.json`) that you can adjust later.
Options:
- `--folder <folder>`: use a specific folder for configuration and report files
- `--only <rule_ids>`: limit the rules used
- `--size <number>`: how many violations are shown for each rule
- `--ignore-violations-from <folder>`: ignore all the violations already reported in the given folder
- `--email <email>`: your email, unlocks writing the report as JSON
- `--key <key>`: unlocks trend rules, ask us for a key
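For instance, to keep reports in a dedicated folder and cap the violations shown per rule (the folder path and size below are illustrative choices, not defaults):

```shell
# Hypothetical example: ./azimutt-reports and --size 3 are arbitrary illustrative values
npx azimutt@latest analyze "postgresql://postgres:postgres@localhost:5432/azimutt_dev" \
  --folder ./azimutt-reports --size 3
```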
Gateway
Launch the gateway server; it acts as a bridge between the Azimutt frontend and your database (converting HTTP queries to SQL ones ^^).
npx azimutt@latest gateway
Export
Export your database schema as JSON, which can be imported into Azimutt.
It's convenient to check what you upload to Azimutt (even if everything stays in your browser until you save and choose).
npx azimutt@latest export <db_url>
Sample URLs:
- PostgreSQL: `postgresql://postgres:postgres@localhost:5432/azimutt_dev`
- MySQL: `mysql://user:[email protected]:3306/my_db`
- MariaDB: `mariadb://user:[email protected]:3306/my_db`
- SQL Server: `Server=host.com,1433;Database=db;User Id=user;Password=pass`
- Oracle: `oracle:thin:system/oracle@localhost:1521/FREE`
- MongoDB: `"mongodb+srv://user:[email protected]"`
- Couchbase: `couchbases://cb.gfn6dh493pmfh613v.cloud.couchbase.com`
- Snowflake: `snowflake://user:[email protected]?db=my_db`
- BigQuery: `bigquery://bigquery.googleapis.com/my-project?key=key.json`
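If you keep credentials in environment variables, you can assemble the connection URL before passing it to the CLI. A minimal sketch for PostgreSQL, reusing the sample credentials above (not real secrets):

```shell
# Assemble a PostgreSQL URL from its parts (sample credentials from above)
DB_USER=postgres
DB_PASS=postgres
DB_HOST=localhost
DB_PORT=5432
DB_NAME=azimutt_dev
DB_URL="postgresql://${DB_USER}:${DB_PASS}@${DB_HOST}:${DB_PORT}/${DB_NAME}"
echo "$DB_URL"
# then, for example: npx azimutt@latest export "$DB_URL"
```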
Options:
- `--database`: restrict extraction to this database or database pattern (uses LIKE pattern with %)
- `--catalog`: restrict extraction to this catalog or catalog pattern (uses LIKE pattern with %)
- `--bucket`: restrict extraction to this bucket or bucket pattern (uses LIKE pattern with %)
- `--schema`: restrict extraction to this schema or schema pattern (uses LIKE pattern with %)
- `--entity`: restrict extraction to this entity or entity pattern (uses LIKE pattern with %)
- `--sample-size`: defines how many items are used to infer a schema (for document databases or JSON fields)
- `--mixed-json`: split collections on the specified JSON field (if you have several kinds of documents in the same collection)
- `--infer-json-attributes`: whether JSON fields should be fetched to infer their schema
- `--infer-polymorphic-relations`: whether the kind field on polymorphic relations should be fetched to know all relations
- `--infer-relations`: build relations based on column names; for example, a `user_id` column will get a relation if a `users` table has an `id` column
- `--ignore-errors`: do not stop the export on errors, just log them
- `--log-queries`: log queries when executing them
- `--format`: defaults to `json`, but for relational databases it can also be `sql`
- `--output`: the database name will be inferred from the URL and prefixed by the timestamp
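For example, to extract only the `public` schema of a local PostgreSQL database as SQL, with relations inferred from column names (this flag combination is illustrative, adjust it to your database):

```shell
# Hypothetical example: restrict to one schema and output SQL instead of JSON
npx azimutt@latest export "postgresql://postgres:postgres@localhost:5432/azimutt_dev" \
  --schema public --format sql --infer-relations
```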
Convert
Convert a dialect to another, supporting AML, SQL (PostgreSQL for now), JSON, Markdown, Mermaid...
npx azimutt@latest convert <file_path> --from <dialect> --to <dialect>
Input dialects: `aml`, `amlv1`, `json`
Output dialects: `aml`, `amlv1`, `postgres`, `mermaid`, `markdown`, `json`
Options:
- `--out <file_path>`: choose the file to write (it will be constructed otherwise)
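For example, to turn an AML file into PostgreSQL DDL (the file names below are placeholders):

```shell
# Hypothetical example: schema.aml and schema.sql are placeholder file names
npx azimutt@latest convert schema.aml --from aml --to postgres --out schema.sql
```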
Diff
(Work In Progress)
Make a schema diff between two databases.
npx azimutt@latest diff <db_url_reference> <db_url_validation>
It will produce a JSON diff that can be converted to SQL.
Developing
Start with `pnpm install` to install dependencies and set up the CLI, then you have:
- `pnpm run exec` to launch the CLI (use `-- args` for CLI args, ex: `npm run exec -- export postgresql://postgres:postgres@localhost:5432/azimutt_dev`), or `npm run build && npm run exec`
- `pnpm run start` to launch it with live reload (same, use `-- args` to pass arguments to the CLI)
- `pnpm run test` to launch tests
Issues:
- upgrading to TypeScript 5.6.2 causes the error `TypeError: Cannot read properties of undefined (reading 'sourceFile')` when running tests :/
- importing @azimutt/aml fails as it's CommonJS, not an ES module :/
Publish
- update `package.json` and `src/version.ts` versions
- update lib versions (`pnpm -w run update` + manual)
- test with `pnpm run dry-publish` and check `azimutt-x.y.z.tgz` content
- launch `pnpm publish --no-git-checks`
View it on npm.
Dev
If you need to develop on multiple libs at the same time (ex: you want to update a connector and try it through the CLI), depend on local libs but publish & revert before commit.
- Depend on a local lib: `pnpm add <lib>`, ex: `pnpm add "@azimutt/models"`
- "Publish" a lib locally by building it: `pnpm run build`