Fetch Multiple Pages

Pagination and Storing Transactions

We're now going to write a script that is a bit more "production-ready".

It will paginate until all transactions for a given time range are retrieved, and it will also store them in an array for later processing.

Updated Fetch Function

Let's modify fetchTransactions.js to include a loop for fetching multiple pages and an array for storing transactions:

const axios = require('axios');

const fetchAllTransactions = async (chain, accountAddress, startTime, endTime) => {
    let transactions = [];
    let hasNextPage = true;
    let nextPageUrl = `https://translate.noves.fi/evm/${chain}/txs/${accountAddress}`;

    while (hasNextPage) {
        try {
            const response = await axios.get(nextPageUrl, {
                headers: { apiKey: `${process.env.NOVES_API_KEY}` },
                params: { startTimestamp: startTime, endTimestamp: endTime },
            });

            // Accumulate this page's transactions.
            transactions = transactions.concat(response.data.items);
            hasNextPage = response.data.hasNextPage;

            // Follow the cursor to the next page, if there is one.
            if (hasNextPage) {
                nextPageUrl = response.data.nextPageUrl;
            }
        } catch (error) {
            console.error('Error fetching transactions:', error);
            hasNextPage = false;
        }
    }

    return transactions;
};

const chain = 'eth'; // Replace with the desired chain
const accountAddress = '0x....'; // Replace with the desired account address
const startTime = 1640995200; // Start timestamp for January 1, 2022
const endTime = 1672531199; // End timestamp for December 31, 2022

fetchAllTransactions(chain, accountAddress, startTime, endTime).then(
    (transactions) => {
        console.log('Total transactions fetched:', transactions.length);
    }
);
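The script reads the API key from the NOVES_API_KEY environment variable. Assuming the file is saved as fetchTransactions.js, one way to set the key and run it looks like this (the key value shown is a placeholder):

```shell
# Export the API key so process.env.NOVES_API_KEY is populated,
# then run the script. Replace the placeholder with your real key.
export NOVES_API_KEY="your-api-key-here"
node fetchTransactions.js
```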

This script now accumulates all fetched transactions in the transactions array.

Once the fetching process is complete, you can process this array and export the data to a CSV file (we'll see how to do this in the next pages).

The final console.log statement provides a quick summary of the total transactions fetched, which is helpful for verification before moving on to further data processing.
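Beyond counting the total, a quick tally can serve as a sanity check before further processing. The sketch below groups transactions by a type field; the `classificationData.type` path is an assumption for illustration, so check it against the actual response shape returned by the API:

```javascript
// Tally fetched transactions by type. The field path
// `classificationData.type` is an assumed example, not a guaranteed
// part of the response shape.
const countByType = (transactions) => {
    const counts = {};
    for (const tx of transactions) {
        const type = tx.classificationData?.type ?? 'unknown';
        counts[type] = (counts[type] || 0) + 1;
    }
    return counts;
};

// Example with placeholder data:
const sample = [
    { classificationData: { type: 'swap' } },
    { classificationData: { type: 'swap' } },
    { classificationData: { type: 'sendToken' } },
];
console.log(countByType(sample)); // { swap: 2, sendToken: 1 }
```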