Commit

remove dashboard service
Tburm committed Oct 3, 2024
1 parent 225e77c commit 3a781de
Showing 67 changed files with 9 additions and 9,474 deletions.
3 changes: 0 additions & 3 deletions .env.example
@@ -21,8 +21,5 @@ DASHBOARD_PORT=8502
AIRFLOW_PORT=8080
AIRFLOW_BASE_URL=

-# dashboard
-LEGACY_DB_LOCATION=
-
# snaxchain api key
CONDUIT_API_KEY=
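For reference, the surrounding block of `.env.example` after this hunk is applied would read roughly as follows (context lines only, assuming nothing else changes in the collapsed parts of the file):

```
AIRFLOW_PORT=8080
AIRFLOW_BASE_URL=

# snaxchain api key
CONDUIT_API_KEY=
```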
6 changes: 0 additions & 6 deletions .streamlit/config.toml

This file was deleted.

11 changes: 0 additions & 11 deletions .streamlit/secrets_example.toml

This file was deleted.

20 changes: 9 additions & 11 deletions README.md
@@ -9,11 +9,11 @@ A collection of services to index, store, and transform data related to Syntheti
At a high level, the data stack contains a set of services to listen to onchain events, store that event data in a Postgres database, then transform and aggregate that data into a format that is useful for analytics.

Read more about each service:
-* [**Database**](./postgres/) - A Postgres database used to store raw and transformed data.
-* [**Indexers**](./indexers/) - Blockchain indexers using Subsquid archives to index Synthetix smart contracts. These indexers are used to populate a Postgres database with raw event log data.
-* [**Extractors**](./extractors/) - Scripts that extract blockchain data using `eth_call` RPC calls and [cryo](https://github.com/paradigmxyz/cryo). Data is stored in a `parquet-data` directory, and will be imported into the Postgres database using the Transformers service.
-* [**Transformers**](./transformers/) - Services that transform raw event log data into a format that is more useful for querying. These services are used to populate a Postgres database with transformed data using [dbt](https://www.getdbt.com/).
-* [**Dashboard**](./dashboard/) - A collection of dashboards built using [streamlit](https://streamlit.io/) and connected directly to the Postgres database.
+- [**Database**](./postgres/) - A Postgres database used to store raw and transformed data.
+- [**Indexers**](./indexers/) - Blockchain indexers using Subsquid archives to index Synthetix smart contracts. These indexers are used to populate a Postgres database with raw event log data.
+- [**Extractors**](./extractors/) - Scripts that extract blockchain data using `eth_call` RPC calls and [cryo](https://github.com/paradigmxyz/cryo). Data is stored in a `parquet-data` directory, and will be imported into the Postgres database using the Transformers service.
+- [**Transformers**](./transformers/) - Services that transform raw event log data into a format that is more useful for querying. These services are used to populate a Postgres database with transformed data using [dbt](https://www.getdbt.com/).

## Usage

@@ -25,16 +25,14 @@ The services are all managed using [docker compose](https://docs.docker.com/comp
- `PG_PASSWORD`: The password for the admin user of the Postgres database.
- `READONLY_PASSWORD`: The password for a configured read-only user, used for dashboards. Change this password and run `make reset-pw` to update the user's password.
- `DB_PORT`: The port that will be used to expose the Postgres database. If left blank, the database will only be exposed to the docker network.
-- `DASHBOARD_PORT`: The port used to expose the streamlit dashboards.
-- `LEGACY_DB_LOCATION`: The location of a legacy SQLite database. This database is used in Optimism Mainnet dashboards summarizing activity on the legacy perps V2 contracts.
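For illustration, a minimal `.env` covering just the variables that remain in the list above might look like the sketch below; the values are placeholders, and other variables from `.env.example` (such as the Airflow and Conduit settings shown earlier) are omitted:

```
PG_PASSWORD=strong-admin-password
READONLY_PASSWORD=strong-readonly-password
DB_PORT=5432
```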

### Start Indexers

-Once you have configured your environment, run `docker compose up -d --build` to build and run the services in detached mode. By default, the service will start a Postgres database, indexers for each network, and a streamlit dashboard on startup. Each indexer will write data to a database corresponding with the network name (ex. `base_mainnet`). You can view the logs for each service using `docker compose logs -f <service-name>`.
+Once you have configured your environment, run `docker compose up -d --build` to build and run the services in detached mode. By default, the service will start a Postgres database, indexers for each network, and an instance of Airflow. Each indexer will write data to a database corresponding with the network name (ex. `base_mainnet`). You can view the logs for each service using `docker compose logs -f <service-name>`.
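Put together, and using only the commands quoted in this paragraph, a typical session might look like the sketch below; the service name `base_mainnet` is illustrative, and the actual names come from the repository's compose file:

```
docker compose up -d --build          # build images and start all services in detached mode
docker compose logs -f base_mainnet   # follow the logs of a single service
```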

-The dashboard service relies on transformed data in the `analytics` database. To populate this database, you must run the transformers.
+See the [data tools](https://github.com/Synthetixio/data-tools) for dashboards that consume this data.

-### Run Extractors
+## Run Extractors

To populate the `parquet-data` directory with data, you must run the extractors. These scripts will run the `eth_call` RPC method for each of the specified calls in the `extractors/main.py` file. To run the extractors, use the following command:

@@ -55,4 +53,4 @@ make import # import the data from the parquet files into the database
make dbt # run dbt for each network
```

-You should see output confirming that dbt has run for each network, and created a set of tables and views in the `analytics` database. The running dashboard service will automatically detect these tables and views and populate the dashboards with data. To view the dashboards, visit `http://localhost:<DASHBOARD_PORT>` in your browser.
+You should see output confirming that dbt has run for each network, and created a set of tables and views in the `analytics` database.
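As an optional sanity check (not something the repository's Makefile provides), you could list what dbt created with `psql`. The sketch below assumes the database is exposed on `DB_PORT`, that `DB_PORT` is exported in your shell, and that the read-only role mentioned above is literally named `readonly`; the role name and connection details are assumptions:

```
psql -h localhost -p "$DB_PORT" -U readonly -d analytics -c "\dt+"   # list tables
psql -h localhost -p "$DB_PORT" -U readonly -d analytics -c "\dv+"   # list views
```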
4 changes: 0 additions & 4 deletions dashboard/.dockerignore

This file was deleted.

77 changes: 0 additions & 77 deletions dashboard/About.py

This file was deleted.

18 changes: 0 additions & 18 deletions dashboard/Dockerfile

This file was deleted.

20 changes: 0 additions & 20 deletions dashboard/README.md

This file was deleted.

205 changes: 0 additions & 205 deletions dashboard/modules/all_chains/core_stats.py

This file was deleted.
