Feat/dune #100

Draft · wants to merge 57 commits into master from feat/dune
Commits (57)
6788ae5
add `gas-refund:generate-dune-query` command
alexshchur Oct 2, 2023
e3cde9a
add `gas-refund:index-dune-transactions` command
alexshchur Oct 2, 2023
1162dee
add axious-curlirize to better trace what HTTP calls script does
alexshchur Oct 2, 2023
f309a0c
add a retry logic to web3 provider (temp solution to unblock)
alexshchur Oct 2, 2023
5eb9a97
rely on dune instead of covalent and subgraph
alexshchur Oct 2, 2023
84a7a44
Merge branch 'master' of github.com:paraswap/paraswap-volume-tracker …
alexshchur Oct 10, 2023
4594022
Merge branch 'master' of github.com:paraswap/paraswap-volume-tracker …
alexshchur Dec 4, 2023
75ad398
Merge branch 'master' of github.com:paraswap/paraswap-volume-tracker …
alexshchur Dec 18, 2023
6532905
Merge branch 'master' of github.com:paraswap/paraswap-volume-tracker …
alexshchur Jan 2, 2024
0171e6b
fix: indexing past epochs, that already has a record in claimable GRP…
alexshchur Jan 3, 2024
176cbde
optim: dont' fetch redundant transactions on indexing
alexshchur Jan 3, 2024
0b9fee6
optim + fix: [draft] bring initState point to the current indexed epo…
alexshchur Jan 3, 2024
3f79651
chore: annotation
alexshchur Jan 3, 2024
b93ae45
fix: reindexing takes too long, and some claims were missing
alexshchur Jan 3, 2024
c96b627
chore: better place for util
alexshchur Jan 4, 2024
2e458f4
chroe: introduce env.ts
alexshchur Jan 4, 2024
a8e9957
chore: improve some naming
alexshchur Jan 4, 2024
30682ce
chore: remove unneeded filter
alexshchur Jan 4, 2024
48d212b
chore: cleanup
alexshchur Jan 4, 2024
c7d939a
chore: remove noize from pr
alexshchur Jan 4, 2024
9d57b3e
chore: naming
alexshchur Jan 4, 2024
44b33a8
chore: naming
alexshchur Jan 4, 2024
9be3af6
chore: get rid of redudant env var
alexshchur Jan 4, 2024
38cb09a
fix: logic flaw in previous commit
alexshchur Jan 4, 2024
181c7ba
chore: fix some annotations
alexshchur Jan 4, 2024
29a9760
Merge branch 'master' of github.com:paraswap/paraswap-volume-tracker …
alexshchur Jan 8, 2024
13dbc7f
optim: dont' fetch unneeded data from dune
alexshchur Jan 8, 2024
49cccab
fix: don't skip claims for after-fix epochs
alexshchur Jan 8, 2024
19fbf2c
Merge branch 'master' of github.com:paraswap/paraswap-volume-tracker …
alexshchur Jan 8, 2024
31077a8
fix some shapes of distribution files
alexshchur Jan 29, 2024
dfe1833
feat: add 43 epoch distribution to config
alexshchur Jan 29, 2024
4d3c1f4
Merge branch 'master' of github.com:paraswap/paraswap-volume-tracker …
alexshchur Jan 29, 2024
e046e3b
Merge branch 'master' of github.com:paraswap/paraswap-volume-tracker …
alexshchur Feb 26, 2024
35519e2
easier provider fix
alexshchur Mar 24, 2024
98ea043
add augustus v6
alexshchur Mar 26, 2024
bf52603
Merge branch 'master' of github.com:paraswap/paraswap-volume-tracker …
alexshchur Apr 22, 2024
c463d70
Merge remote-tracking branch 'origin' into feat/dune
alexshchur May 10, 2024
dfccbe8
fix: add augustus 6.0 to tx resolver
alexshchur May 10, 2024
93edd87
more verbosity
alexshchur May 20, 2024
be14bc9
add more scripts to organize the process better
alexshchur May 20, 2024
0c96b18
split some scripts by epoch
alexshchur Jun 17, 2024
79c9e06
Merge remote-tracking branch 'origin/master' into feat/dune
alexshchur Jun 17, 2024
a8903e6
epoch 48 config for aura rewards
alexshchur Jul 3, 2024
4bf255e
update some scripts
alexshchur Jul 3, 2024
a83be5b
add augustus v6.1
alexshchur Jul 3, 2024
02ab078
Merge remote-tracking branch 'origin' into feat/dune
alexshchur Jul 3, 2024
822c6be
fix generating dune query
alexshchur Jul 3, 2024
9990a59
adjust dune data flow for v6 with latest changes in master
alexshchur Jul 3, 2024
e27bc4f
greedier fetch tx (no way multiple epochs can be indexed within a sin…
alexshchur Jul 3, 2024
9da5308
Merge remote-tracking branch 'origin' into feat/dune
alexshchur Jul 11, 2024
3acf03c
add augustus v6.2
alexshchur Jul 16, 2024
2a97b1b
add success=true filter
alexshchur Jul 16, 2024
b84aab4
add 49 epoch config
alexshchur Jul 16, 2024
d2de050
Merge branch 'feat/delta' of https://github.com/paraswap/paraswap-vol…
alexshchur Sep 6, 2024
f5e703b
feat: delta adjustments
alexshchur Sep 6, 2024
6d8c89b
fix: amend delta USD computation
alexshchur Sep 7, 2024
33a70d2
Merge branch 'master' of https://github.com/paraswap/paraswap-volume-…
alexshchur Oct 30, 2024
10 changes: 10 additions & 0 deletions package.json
@@ -16,7 +16,17 @@
"gas-refund:prod:compute-gas-refund-save-db": "node scripts/gas-refund-program/computeGasRefund.js",
"gas-refund:computeDistributionDataAndPersistDB": "patch-package && NODE_ENV=development ts-node scripts/gas-refund-program/distribution/computeDistributionDataAndPersistDB",
"gas-refund:computeDistributionDataAndPersistDB-epoch-47": "DISTRIBUTED_EPOCH=47 yarn gas-refund:computeDistributionDataAndPersistDB",
"gas-refund:computeDistributionDataAndPersistDB-epoch-48": "DISTRIBUTED_EPOCH=48 yarn gas-refund:computeDistributionDataAndPersistDB",
"gas-refund:computeDistributionFilesAndPersistIPFS": "patch-package && NODE_ENV=development ts-node scripts/gas-refund-program/distribution/computeDistributionFilesAndPersistIPFS",
"gas-refund:generate-dune-query": "patch-package && NODE_ENV=development ts-node scripts/gas-refund-program/generate-dune-query",
"gas-refund:index-dune-transactions": "patch-package && NODE_ENV=development NODE_OPTIONS='--max-old-space-size=8200' ts-node scripts/gas-refund-program/index-dune-transactions",
"gas-refund:prepare-database-for-DuneTransactions-17": "source .env && psql $ROOT_DB_URI -c \"drop database if exists volume_tracker_dune_transactions_epoch_17_aka_47\" && psql $ROOT_DB_URI -c \"create database volume_tracker_dune_transactions_epoch_17_aka_47\"",
"gas-refund:prepare-database-for-DuneTransactions-18": "source .env && psql $ROOT_DB_URI -c \"drop database if exists volume_tracker_dune_transactions_epoch_18_aka_48\" && psql $ROOT_DB_URI -c \"create database volume_tracker_dune_transactions_epoch_18_aka_48\"",
"gas-refund:index-dune-transactions-epoch-015": "source .env && cat ./.vscode/txses-archive/dune-response-15-with-augustus-v6.json | DATABASE_NAME=volume_tracker_dune_transactions_epoch_015_aka_46 DATABASE_URL=$DUNE_TRANSACTIONS_DATABASE_URL_015 yarn gas-refund:index-dune-transactions",
"gas-refund:index-dune-transactions-epoch-17": "source .env && cat ./.vscode/txses-archive/dune-response-17.json | DATABASE_NAME=volume_tracker_dune_transactions_epoch_17_aka_47 DATABASE_URL=$DUNE_TRANSACTIONS_DATABASE_URL_17 yarn gas-refund:index-dune-transactions # note: the new-style index used here is the human-readable one, not the epoch's own index",
"gas-refund:index-dune-transactions-epoch-18": "source .env && cat ./.vscode/txses-archive/dune-response-18.json | DATABASE_NAME=volume_tracker_dune_transactions_epoch_18_aka_48 DATABASE_URL=$DUNE_TRANSACTIONS_DATABASE_URL_18 yarn gas-refund:index-dune-transactions",
"gas-refund:dev:dune:compute-gas-refund-save-db-18": "source .env && psql $DATABASE_URL -c 'drop table \"DuneTransactions\"'; yarn _copy-indexed-dune-transactions-18; yarn gas-refund:dev:compute-gas-refund-save-db",
"_copy-indexed-dune-transactions-18": "source .env && /usr/lib/postgresql/16/bin/pg_dump -t '\"DuneTransactions\"' $DUNE_TRANSACTIONS_DATABASE_URL_18 | psql $DATABASE_URL",
"migrate:up": "source .env && DATABASE_URL=$DATABASE_URL npx sequelize-cli db:migrate # <- executes any new migrations not yet recorded in the SequelizeMeta table, sorted alphabetically",
"migrate:undo": "source .env && DATABASE_URL=$DATABASE_URL npx sequelize-cli db:migrate:undo # <- undoes the most recent migration recorded in the SequelizeMeta table",
"test": "jest"
151 changes: 151 additions & 0 deletions scripts/gas-refund-program/generate-dune-query.ts
@@ -0,0 +1,151 @@
import * as dotenv from 'dotenv';
dotenv.config();
import '../../src/lib/log4js';

import {
getCurrentEpoch,
resolveEpochCalcTimeInterval,
} from '../../src/lib/gas-refund/epoch-helpers';
import { GRP_SUPPORTED_CHAINS } from '../../src/lib/gas-refund/gas-refund';
import { getContractAddresses } from './transactions-indexing/transaction-resolver';
import * as moment from 'moment';

import { MIGRATION_SEPSP2_100_PERCENT_KEY } from './staking/2.0/utils';
import { isTruthy } from '../../src/lib/utils';
import { CHAIN_ID_OPTIMISM } from '../../src/lib/constants';

export const CHAIN_ID_TO_DUNE_NETWORK: Record<number, string> = {
1: 'ethereum',
56: 'bnb',
137: 'polygon',
250: 'fantom',
10: 'optimism',
42161: 'arbitrum',
43114: 'avalanche-c',
};

export function timestampToDuneFormatted(timestamp: number) {
return `'${moment.unix(timestamp).utc().format('YYYY-MM-DD HH:mm:ss')}'`;
}

function getContractsByChainId() {
const currentEpoch = getCurrentEpoch();
const contractAddressesByChainId = Object.fromEntries(
GRP_SUPPORTED_CHAINS.map(chainId => [
chainId,
getContractAddresses({ epoch: currentEpoch, chainId }).filter(
address => address !== MIGRATION_SEPSP2_100_PERCENT_KEY,
),
]),
);
return contractAddressesByChainId;
}

// @TODO: probably should use a templating engine here
async function generateDuneQuery() {
const currentEpoch = getCurrentEpoch();
const { startCalcTime, endCalcTime } = await resolveEpochCalcTimeInterval(
currentEpoch - 1,
);
const dateFrom = timestampToDuneFormatted(startCalcTime);
const dateTo = timestampToDuneFormatted(endCalcTime);

const contractsByChainId = getContractsByChainId();

const parts = GRP_SUPPORTED_CHAINS.map(chainId => {
const network = CHAIN_ID_TO_DUNE_NETWORK[chainId];
const transactionsInvolvingContract = `transactionsInvolvingContract_${network}`;
const contracts = [...contractsByChainId[chainId]].join(',');

const txTableColumns = `
from
gas_price
hash
to
block_number
block_time
gas_used
l1_fee
success
`
.split(/\n/)
.map(s => s.trim())
.filter(isTruthy);

const txTableColumnsPart =
chainId === CHAIN_ID_OPTIMISM
? txTableColumns
.map(s => `cast(transactions."${s}" as varchar) as "${s}"`)
.join(', ')
: txTableColumns
.map(s =>
s.includes('l1_')
? `'n/a' as "${s}"`
: `cast(transactions."${s}" as varchar) as "${s}"`,
)
.join(', ');

const networkData = `networkData_${network}`;
const query = `

${transactionsInvolvingContract} as (
select
tx_hash,
max(to) as contract,
max(block_time),
max(block_number) as block_number
from
${network}.traces
where
block_time >= to_timestamp(${dateFrom}, 'yyyy-mm-dd hh24:mi:ss')
and block_time <= to_timestamp(${dateTo}, 'yyyy-mm-dd hh24:mi:ss')
and to in (${contracts})
group by
tx_hash
order by
max(block_time) desc
),
${networkData} as (
select
${chainId} as chainId, ${transactionsInvolvingContract}.contract as contract, ${txTableColumnsPart}
from
${transactionsInvolvingContract}
left join ${network}.transactions as transactions on ${transactionsInvolvingContract}.block_number = transactions.block_number
and ${transactionsInvolvingContract}.tx_hash = transactions.hash
and transactions.block_time >= to_timestamp(${dateFrom}, 'yyyy-mm-dd hh24:mi:ss')
and block_time <= to_timestamp(${dateTo}, 'yyyy-mm-dd hh24:mi:ss')
where transactions.success = true
)`;

return [networkData, query];
});

const queries = parts.map(([, query]) => `${query}`).join(',\n');

const unionPart = parts
.map(([networkData]) => `(select * from ${networkData})`)
.join(' UNION \n');

return `with ${queries} SELECT * from (\n${unionPart}) ORDER BY block_time DESC`;
}

async function main() {
const query = await generateDuneQuery();
console.log('________________________________________________');
console.log(
"-- This is a generated query. Don't modify it manually, as it'll get overwritten by script",
);
console.log(query);
console.log('________________________________________________');
console.log('Use the above output here https://dune.com/queries');
}

main()
.then(res => {
console.log('script finished', res);
process.exit(0);
})
.catch(error => {
console.error('script failed', error);
process.exit(1);
});
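For reference, `timestampToDuneFormatted` above emits a single-quoted UTC timestamp literal that is interpolated straight into the generated SQL. A dependency-free sketch of the same formatting (a hypothetical standalone rewrite using plain `Date` instead of `moment`):

```typescript
// Sketch of timestampToDuneFormatted without the moment dependency.
// Assumption: only second precision in UTC is needed, matching the
// original 'YYYY-MM-DD HH:mm:ss' format string.
function timestampToDuneFormattedSketch(timestamp: number): string {
  const iso = new Date(timestamp * 1000).toISOString(); // e.g. 2023-08-31T22:14:05.000Z
  return `'${iso.slice(0, 19).replace('T', ' ')}'`;
}

console.log(timestampToDuneFormattedSketch(0)); // prints '1970-01-01 00:00:00' (quotes included)
```

The surrounding single quotes matter: the value is spliced directly into `to_timestamp(..., 'yyyy-mm-dd hh24:mi:ss')` in the query template.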
117 changes: 117 additions & 0 deletions scripts/gas-refund-program/index-dune-transactions.ts
@@ -0,0 +1,117 @@
import * as dotenv from 'dotenv';
dotenv.config();
import '../../src/lib/log4js';
import { sliceCalls } from '../../src/lib/utils/helpers';
import Database from '../../src/database';
import { DuneRow, DuneTransaction } from '../../src/models/DuneTransaction';

import * as fs from 'fs';

const tmpDirname = './dune-split-data-temp';

// reads the Dune JSON response from stdin and slices its rows into files under tmpDirname
function readStdinAndSliceIntoTempFolder() {
var stdinBuffer = fs.readFileSync(0); // STDIN_FILENO = 0

const data = JSON.parse(stdinBuffer.toString());

console.log('data', data);

// this is from-browser GQL response
//const origRows = data.data.get_execution.execution_succeeded.data;

const origRows = data.result.rows;

fs.mkdirSync(tmpDirname, { recursive: true });

sliceCalls({
inputArray: origRows,
execute: (_rows, sliceIdx) => {
require('fs').writeFileSync(
`${tmpDirname}/slice-${sliceIdx.toString().padStart(3, '0')}.json`,
JSON.stringify(_rows, null, 2),
);
},
sliceLength: 10000,
});

// this is from-browser GQL response
// const columns = data.data.get_execution.execution_succeeded.columns;
const columns = data.result.metadata.column_names;
require('fs').writeFileSync(
`${tmpDirname}/_columns.json`,
JSON.stringify(columns, null, 2),
);
}

function cleanupTempFolder() {
fs.rmSync(tmpDirname, { recursive: true, force: true });
}

async function loadSlicesIntoDb() {
const files = fs
.readdirSync(tmpDirname)
.filter(file => !!file.match(/slice-\d+.json/));

await Database.connectAndSync('transform-dune-response');
await DuneTransaction.destroy({ truncate: true });

for (let i = 0; i < files.length; i++) {
const filename = files[i];
const origRows: DuneRow[] = JSON.parse(
fs.readFileSync(`${tmpDirname}/${filename}`).toString(),
);

const rows: Partial<DuneRow>[] = origRows.map(row => ({
...Object.fromEntries(
Object.entries(row).map(([key, value]) => [
key,
value === 'n/a' ? null : value,
]),
),
block_timestamp: row.block_time
? Date.parse(row.block_time) / 1000
: undefined, // looks like 2023-08-31 22:14:05.000 UTC
}));

const queries = sliceCalls({
inputArray: rows,
execute: async (_rows, sliceIdx) => {
try {
await DuneTransaction.bulkCreate(_rows);
} catch (e) {
// error in bulk? retry one by one to surface which row is faulty
await Promise.all(
_rows.map(async row => {
try {
await DuneTransaction.bulkCreate([row]);
} catch (rowError) {
// log at the individual-row level to debug what went wrong
console.error('failed to insert individual row', rowError);
throw rowError;
}
}),
);
throw e;
}
console.log(`inserted slice ${sliceIdx}. Rows length:`, _rows.length);
},
sliceLength: 1000,
});
}
}

async function main() {
readStdinAndSliceIntoTempFolder();

await loadSlicesIntoDb();

cleanupTempFolder();
}

main();
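The per-row mapping in `loadSlicesIntoDb` (`'n/a'` placeholders to `null`, plus a derived `block_timestamp` in epoch seconds) can be sketched in isolation. `normalizeRow` below is a hypothetical extraction of that logic, not a function in the PR:

```typescript
// Hypothetical standalone version of the row normalization: 'n/a' placeholders
// (used for the missing l1_* columns on non-Optimism chains) become null, and
// block_time is parsed into epoch seconds.
function normalizeRow(
  row: Record<string, string>,
): Record<string, string | number | null | undefined> {
  return {
    ...Object.fromEntries(
      Object.entries(row).map(([key, value]) => [
        key,
        value === 'n/a' ? null : value,
      ]),
    ),
    block_timestamp: row.block_time
      ? Date.parse(row.block_time) / 1000
      : undefined,
  };
}

const normalized = normalizeRow({
  hash: '0xabc',
  l1_fee: 'n/a',
  block_time: '2023-08-31T22:14:05.000Z',
});
// normalized.l1_fee === null; normalized.block_timestamp === 1693520045
```

Real Dune `block_time` strings look like `2023-08-31 22:14:05.000 UTC` (as the inline comment in the script notes); an ISO string is used in this sketch so `Date.parse` behaves identically across engines.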
@@ -71,7 +71,14 @@ function constructTransactionsProcessor({
`could not retrieve psp/chaincurrency same day rate for swap at ${transaction.timestamp}`,
);

const currGasUsedChainCur = gasSpentInChainCurrencyWei
const currGasUsedChainCur = transaction.txGasUsedUSD // if a USD override is present (most likely a delta tx), derive the chain-currency gas spend from it
? new BigNumber(
new BigNumber(transaction.txGasUsedUSD)
.multipliedBy(10 ** 18)
.dividedBy(currencyRate.chainPrice)
.toFixed(0),
)
: gasSpentInChainCurrencyWei
? new BigNumber(gasSpentInChainCurrencyWei)
: new BigNumber(txGasUsed).multipliedBy(
transaction.txGasPrice.toString(),
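The new `txGasUsedUSD` branch above converts a USD gas figure back into chain-currency wei: USD spent, divided by the chain token's USD price, scaled by 10^18. A sketch with plain numbers instead of `BigNumber` (function and parameter names are illustrative, not from the PR):

```typescript
// Illustrative version of the USD-override conversion: given gas cost in USD and
// the chain currency's USD price, recover the spend in wei, rounded to an integer
// (the original produces an integer via BigNumber's toFixed(0)).
function usdGasToChainCurrencyWei(txGasUsedUSD: number, chainPriceUSD: number): number {
  return Math.round((txGasUsedUSD / chainPriceUSD) * 1e18);
}

// e.g. $3.50 of gas at $2,000/ETH ≈ 1.75e15 wei (0.00175 ETH)
const wei = usdGasToChainCurrencyWei(3.5, 2000);
```

Plain doubles lose precision at wei scale, which is why the PR uses `BigNumber`; this sketch only shows the arithmetic.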
@@ -248,24 +255,31 @@ export async function fetchRefundableTransactions({
return result.flat();
}),

...Array.from(AUGUSTUS_SWAPPERS_V6_OMNICHAIN).map(async contractAddress => {
const epochNewStyle = epoch - GasRefundV2EpochFlip;

const lastTimestampProcessed = lastTimestampTxByContract[contractAddress];

const allStakersTransactionsDuringEpoch =
await fetchParaswapV6StakersTransactions({
epoch: epochNewStyle,
timestampGreaterThan: lastTimestampProcessed,
chainId,
address: contractAddress,
});

return await processRawTxs(
allStakersTransactionsDuringEpoch,
(epoch, totalUserScore) => getRefundPercent(epoch, totalUserScore),
);
}),
// in this branch, v6 txs sit together with v5 in the global config
...(chainId !== 1
? [] // only for chainId = 1 is data taken from metabase
: Array.from(AUGUSTUS_SWAPPERS_V6_OMNICHAIN).map(
async contractAddress => {
const epochNewStyle = epoch - GasRefundV2EpochFlip;

const lastTimestampProcessed =
lastTimestampTxByContract[contractAddress];

const allStakersTransactionsDuringEpoch =
await fetchParaswapV6StakersTransactions({
epoch: epochNewStyle,
timestampGreaterThan: lastTimestampProcessed,
chainId,
address: contractAddress,
});

return await processRawTxs(
allStakersTransactionsDuringEpoch,
(epoch, totalUserScore) =>
getRefundPercent(epoch, totalUserScore),
);
},
)),
]);

const flattened = allTxsV5AndV6Merged.flat();
@@ -46,7 +46,7 @@ export async function fetchRefundableTransactionsAllChains() {
'cannot compute refund data for epoch < genesis_epoch',
);

for (let epoch = startEpoch; epoch <= getCurrentEpoch(); epoch++) {
for (let epoch = startEpoch; epoch < startEpoch + 1; epoch++) {
const { startCalcTime, endCalcTime } =
await resolveEpochCalcTimeInterval(epoch);
