Handle Amazon report binary problem

Hi there,

I am trying to handle an Amazon report. It is called:

GET_LEDGER_SUMMARY_VIEW_DATA

To do this, I copied an existing scenario for another report, changing only which report to retrieve.

But now I have run into trouble, because I can’t recover the original file from the binary data the module downloads. I thought I could use the same settings as in the other scenario, but unfortunately that doesn’t work. I guess the report fetched from Amazon might be very different technically, but since I can’t see any usable result, I am unable to make progress.

This is what I get:

but the previous module seems to get the file correctly, since the result looks fine:

However, when I use the same setting as in the other scenario, which is Parse CSV with the data result inside a toString() function, like this:

I get the garbled output shown above. I have tried various things, but without success.

How can I handle this?

Thx

There is an encoding issue when downloading certain files.

I have mentioned more details, and a possible workaround for this here: Read text in .txt file stored in google drive - #4 by samliew

You can try using a “Convert encoding” module using the same input and output encoding, then you can use the output of this module in a subsequent module (like Parse CSV or JSON).

Alternatively, you can simply try using the toString built-in function in the mapped field, something like this:  toString(data)  – however this method doesn’t work in some cases where the encoding actually has to be converted.
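To illustrate why the output can look like “rubbish”: a minimal Python sketch of what an encoding mismatch does to raw bytes (illustrative only; this is not Make’s internal implementation, and the sample data is made up):

```python
# A report document arrives as raw bytes; to use it as text (e.g. in
# Parse CSV), those bytes must be decoded with the right encoding.
raw = "SKU\tMenge\t(Stück)\n".encode("utf-8")  # sample UTF-8 bytes

# Decoding with the correct encoding yields readable text:
text = raw.decode("utf-8")

# Decoding the same bytes with a wrong encoding often raises no error;
# it silently produces mojibake like the garbled output shown above:
garbled = raw.decode("latin-1")

print(text)     # SKU	Menge	(Stück)
print(garbled)  # SKU	Menge	(StÃ¼ck)
```

This is why re-encoding with a “Convert encoding” module can fix the text in some cases, while in others (as it turns out below) the bytes are not mis-encoded at all but compressed.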


Hello samliew, thank you for engaging with my issue. Unfortunately, none of the following settings work for me:

- using a “Convert encoding” module
- plain input data with the same input and output encoding (UTF-8)
- toString(input data) with the same input and output encoding (UTF-8)
- toString(input data) with only the output encoding set (UTF-8)

This is strange. Since the downloaded file looks fine in its binary form, I think there must be a way to make the file usable somehow?

I just solved it by re-reading my other post about a similar problem in the past:

we have to use a GUnzip module before using the data
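To sketch what that workaround does outside of Make (a hedged illustration in Python, not the actual module code): gzip data starts with the magic bytes `0x1f 0x8b`, so you can detect whether a downloaded report document still needs decompressing before parsing it as tab-delimited text. The helper name is mine, not part of any Make module:

```python
import csv
import gzip
import io

def parse_report_document(raw: bytes) -> list[list[str]]:
    """Decompress a report document if it is gzip-compressed,
    then parse it as tab-delimited text."""
    if raw[:2] == b"\x1f\x8b":       # gzip magic number
        raw = gzip.decompress(raw)   # what the Gunzip module does
    text = raw.decode("utf-8")
    return list(csv.reader(io.StringIO(text), delimiter="\t"))

# Simulate a gzip-compressed ledger report (sample data, not real output):
report = "date\tsku\tquantity\n2023-01-01\tABC-123\t5\n"
rows = parse_report_document(gzip.compress(report.encode("utf-8")))
print(rows[0])  # ['date', 'sku', 'quantity']
```

Running the compressed bytes straight into a text decode is what produced the garbage earlier; decompressing first yields clean tab-delimited rows.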


Oooh, that’s weird.

Now we have someone saying “don’t use Gunzip module” in an older thread here:

Update… I turfed the GUNZIP module and am using the Convert Encoding module. It outputs a single line text which is converted into a table if the header is ignored.
Amazon Seller Central - GUNZIP Error - #17 by Ricardo_da_Costa

And now you’re saying Gunzip works with the same “Download a Report Document”.

The only difference I can see is one contains JSON data, and yours is CSV data.

So Amazon might be applying GZIP compression to only the CSV downloads.
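If that is the case, the SP-API itself indicates it: the `getReportDocument` response includes an optional `compressionAlgorithm` field, set to `GZIP` when the document is compressed. A small sketch of how you could branch on it (the response dicts below are made up for illustration; elided values are left as `…`):

```python
def needs_gunzip(report_document: dict) -> bool:
    # Per the SP-API Reports reference, compressionAlgorithm is only
    # present when the document is compressed; "GZIP" is the only
    # value currently defined.
    return report_document.get("compressionAlgorithm") == "GZIP"

csv_doc = {"reportDocumentId": "…", "url": "…", "compressionAlgorithm": "GZIP"}
json_doc = {"reportDocumentId": "…", "url": "…"}  # no compression field

print(needs_gunzip(csv_doc))   # True  -> route through Gunzip first
print(needs_gunzip(json_doc))  # False -> parse directly
```

So rather than hard-coding “always Gunzip” or “never Gunzip”, checking this field would handle both report types.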


Well, I’m still having issues with that file. I thought the encoding problem was solved, since I can finally see some text! :smiley:

But now I see that the file contains only the header. Whatever settings I use, there is no content. To check the file first, I simply send it to my Google Cloud, uploading it as a .txt file.

But all exports contain only the header.

Do you have any idea why this is so?

Try getting the JSON export, like how the user in the other thread is doing.


Hmm, this is very strange. No, the output isn’t JSON. It is a tab-delimited text file. The only thing is:

it is empty!

When I get the same report from Seller Central with the same settings, it has 98 rows. Could this be a problem within the Make “Get a Report” module?

Ok, sometimes it is good to step away for a while before continuing to troubleshoot. I could finally identify why my reports were empty:

If you use the ledger summary report and define a period aggregation, the start and end times must align exactly with the defined period.

For example, if you select monthly, the start date must be the first day of the month and the end date the last day of the month.
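That alignment is easy to compute programmatically. A small illustrative Python sketch (the function name is mine) that derives the first and last day of a month for the monthly aggregation:

```python
import calendar
from datetime import date

def monthly_period(year: int, month: int) -> tuple[date, date]:
    """Return (start, end) dates covering exactly one calendar month,
    as the ledger summary report's monthly aggregation requires."""
    last_day = calendar.monthrange(year, month)[1]  # length of the month
    return date(year, month, 1), date(year, month, last_day)

start, end = monthly_period(2023, 2)
print(start.isoformat(), end.isoformat())  # 2023-02-01 2023-02-28
```

Requesting any range that does not line up with these boundaries is what produced the header-only reports above.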

Now it works fine with the Gunzip and CSV modules.


Hello @Arwin :wave:

I just want to quickly say congratulations on getting this up and running with the guidance of @samliew :clap:

Thanks a lot for stepping back in here and sharing your final insights and findings with the rest of the community. This is incredibly valuable and can be super helpful to others looking for similar information. :pray:


Hi @Arwin

I had the same issue as you and fixed it the same way. However, I have not had success retrieving data for 2024 (2023 works fine).
Is it working for you?