Retrieve JSON file into Google Sheets for Looker Studio to read

Hello community!
@Tony_KRT and I are looking for a way to retrieve data from a JSON file accessed from an API with Basic HTTP Authentication.

This is the structure of the JSON file:

{
  "results": [
    {...},
    {...},
    ...
  ],
  "next_page_url": [url for the next page]
}

The "next_page_url" field contains a URL from which I can retrieve another JSON file with the same structure.

I'm able to retrieve the first set of "results" and write it into a Google Sheet, and I can then fetch the "next_page_url"; but I'm having trouble appending that next page's data to the Google Sheet without headers.


Here is the code that I'm using in Google Apps Script:

 

function updateJsonData() {
  var ss = SpreadsheetApp.getActive();
  var rangeToUpdate = ss.getRange('Sheet1!A1');
  const baseUrl = {baseurl}
  var url = {url}
  var username = {username};
  var password = {password};
 
  const auth = "Basic " + Utilities.base64Encode("username:password");
  let options = {
    method: "get",
    headers: {
      Authorization: auth
    },
    muteHttpExceptions: true
  };

  let response = UrlFetchApp.fetch(url, options);
  let json = JSON.parse(response.getContentText());
  let results = json["results"];

  var ss = SpreadsheetApp.getActive();
  var srcSheet = ss.getActiveSheet();
  srcSheet.getRange("A:BP").clearContent();
  while (json["next_page_url"]) {
    let nextUrl = baseUrl + json["next_page_url"];
    response = UrlFetchApp.fetch(nextUrl, options);
    json = JSON.parse(response.getContentText());
    Logger.log(json);
    results = results.concat(json["results"])
  }

  rangeToUpdate.offset(0, 0, results.length, results[0].length).setValues(results);
}


Could anyone tell me what I'm doing wrong?
Also, sorry in advance if I'm not following the community rules; I'm new here.
Thanks in advance for the help provided!
1 ACCEPTED SOLUTION

Hi @miguel_krt ,

You're close! There are two issues here: Utilities.base64Encode("username:password") encodes that literal string instead of your actual credentials, so the Authorization header is wrong; and json["results"] is an array of objects, so results[0].length is undefined and setValues() can't take it directly.

Here's what to fix:
- Encode the real username and password variables in the Basic auth header
- Accumulate every page's results, then convert each object into a row array so you write a flat 2D array with only the data (no headers)

function updateJsonData() {
  var ss = SpreadsheetApp.getActive();
  var sheet = ss.getActiveSheet();
  sheet.clearContents();

  const baseUrl = '{baseurl}';
  var url = '{url}';
  var username = '{username}';
  var password = '{password}';

  // Encode the actual credential variables, not the literal string "username:password".
  const auth = "Basic " + Utilities.base64Encode(username + ":" + password);
  let options = {
    method: "get",
    headers: { Authorization: auth },
    muteHttpExceptions: true
  };

  // Fetch the first page and start accumulating results.
  let allResults = [];
  let response = UrlFetchApp.fetch(url, options);
  let json = JSON.parse(response.getContentText());
  allResults = allResults.concat(json["results"]);

  // Keep following "next_page_url" until the API stops returning one.
  while (json["next_page_url"]) {
    let nextUrl = baseUrl + json["next_page_url"];
    response = UrlFetchApp.fetch(nextUrl, options);
    json = JSON.parse(response.getContentText());
    allResults = allResults.concat(json["results"]);
  }

  // Convert each result object into a row array so setValues() gets a 2D array.
  let rows = allResults.map(obj => Object.values(obj));

  // Write only the data (no headers), starting at A1.
  if (rows.length > 0) {
    sheet.getRange(1, 1, rows.length, rows[0].length).setValues(rows);
  }
}
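
One caveat on the Object.values(obj) step: it relies on every result object carrying its keys in the same order on every page. If the API ever varies the field order, your columns will shift. A more defensive variant (just a sketch, with hypothetical field names like "id", "name", and "created_at" standing in for your real keys) pins the column order explicitly:

function toRows(allResults) {
  // Hypothetical field names -- replace with the actual keys in your "results" objects.
  const columns = ["id", "name", "created_at"];
  // Reading each field by name keeps the column order fixed regardless of JSON key order.
  return allResults.map(obj => columns.map(key => obj[key] !== undefined ? obj[key] : ""));
}

Then pass toRows(allResults) to setValues() instead of the Object.values mapping.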


4 REPLIES

Thank you!
This worked as per our needs!
Really appreciate the help provided!

Hi,


One way you could get the results you're looking for is by splitting each page of your URL into its own API call. So if you've got, say, 5 pages in your JSON data source, you'd create 5 API calls.
For smaller datasets, you can stick to one Google Sheets workbook with 5 sheets, one per call. But if you're dealing with a huge dataset, it might be better to go with one workbook per call.


Once you have all that set up, you can use Apps Script to merge the data from each sheet into one. That should help you get what you need.
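
In case it helps, here's a minimal sketch of that merge step, assuming the per-call sheets live in one workbook and are named "Page1" through "Page5", with the combined sheet named "Merged" (all hypothetical names):

function mergeSheets() {
  var ss = SpreadsheetApp.getActive();
  var target = ss.getSheetByName("Merged");
  target.clearContents();
  var row = 1;
  ["Page1", "Page2", "Page3", "Page4", "Page5"].forEach(function(name, i) {
    var src = ss.getSheetByName(name);
    if (!src) return; // skip any sheet that doesn't exist
    var data = src.getDataRange().getValues();
    if (i > 0) data = data.slice(1); // keep the header row from the first sheet only
    if (data.length > 0) {
      // Append this sheet's block directly below what has been written so far.
      target.getRange(row, 1, data.length, data[0].length).setValues(data);
      row += data.length;
    }
  });
}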


Just a heads up, Google Sheets has some limits:


• 10 million cells per spreadsheet
• 18,278 columns
• 100 MB file size
• 50,000 characters per cell


Apps Script also has its own restrictions:


• 6 minutes of script runtime per execution (30 minutes if you've got a paid Google Workspace account)
• 50 MB max for the script
• 1 million read requests and 500,000 write requests per day


If these limits are a bit much, BigQuery could be a good alternative. It's much better at handling large datasets and doesn't have the same restrictions as Google Sheets. Plus, the free tier is pretty generous: 1 terabyte of query processing and 10 GB of storage each month. You can also pull data from APIs, which makes it perfect for automating data collection and analysis. It's also super fast, so it works really well with Looker Studio.
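
If you do go the BigQuery route, Apps Script can stream rows straight into a table through the BigQuery advanced service (you have to enable it under Services first). Here's a rough sketch, assuming hypothetical project, dataset, and table IDs and the same allResults array of result objects built above:

function loadResultsIntoBigQuery(allResults) {
  // Hypothetical identifiers -- replace with your own project, dataset, and table.
  var projectId = "my-project";
  var datasetId = "my_dataset";
  var tableId = "api_results";
  // tabledata.insertAll streams rows; each row's "json" must match the table schema.
  var request = {
    rows: allResults.map(function(obj) { return { json: obj }; })
  };
  var response = BigQuery.Tabledata.insertAll(request, projectId, datasetId, tableId);
  if (response.insertErrors) {
    Logger.log(response.insertErrors);
  }
}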


Hope this helps! Don’t hesitate to ask if you have more questions.

Thanks for the reply!
I don't think an API call per page would be the best approach, because there is no fixed number of pages; it changes dynamically.

However, I do appreciate the information about BigQuery. Do you have any site I can review for the Free Tier features? I find the pricing page very confusing.

Thanks again!