(Use Case) Automating API Data Loading (⭐⭐⭐)

Created by: data_popcorn

Prerequisite

Usage examples

After creating a DB on top of the stored data, query it with AWS Athena → connect the results to a BI dashboard.
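For illustration, the Athena step could look like the sketch below. The database name, table name, and output bucket are placeholders, not values taken from the workflow; the columns mirror the fields mentioned later in the Scenario.

```python
# Hypothetical Athena query over the S3 data; all names here are
# assumptions to be replaced with your actual Glue/Athena setup.
DATABASE = "emergency_db"   # placeholder database name
TABLE = "emergency_rooms"   # placeholder table name

query = f"""
SELECT dutyName, hvidate
FROM {TABLE}
WHERE hvidate >= '2024010100'
ORDER BY hvidate DESC
""".strip()

# With boto3, the query could be submitted like this (not executed here):
# import boto3
# athena = boto3.client("athena")
# athena.start_query_execution(
#     QueryString=query,
#     QueryExecutionContext={"Database": DATABASE},
#     ResultConfiguration={"OutputLocation": "s3://your-bucket/athena-results/"},
# )
```

The query result set can then be wired into a BI dashboard as a data source.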

Scenario

💡
One-line summary: automatically fetch emergency room-related data from the “Emergency Room API,” process it, and save it to Google Sheets and AWS S3.
1. Check the API guide in the Source above and obtain an API_KEY.
2. Enter the API_KEY in the workflow's API Key node.
3. Adjust the input parameters in the API Call node's settings.
4. Upload the results to Google Sheets or AWS S3.
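As a rough sketch of what the API Call node sends, the request URL could be assembled as below. The base URL and parameter names are assumptions (check the API guide from the Source for the real ones); only serviceKey, the key stored by the workflow, is taken from the text.

```python
from urllib.parse import urlencode

SERVICE_KEY = "YOUR_API_KEY"  # value entered in the API Key node
# Hypothetical endpoint -- replace with the URL from the API guide.
BASE_URL = "https://example.data.go.kr/emergency-rooms"

# Example input parameters (names are placeholders); the region filter
# corresponds to restricting results to Seoul, as the workflow does.
params = {
    "serviceKey": SERVICE_KEY,
    "region": "Seoul",
    "pageNo": 1,
    "numOfRows": 100,
    "_type": "json",
}

request_url = f"{BASE_URL}?{urlencode(params)}"
# The API Call node issues a GET request to request_url and receives
# the emergency-room list as JSON.
```

Adjusting the input parameters in the node settings corresponds to editing the `params` dictionary here.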
1. Schedule Trigger: runs the workflow at set time intervals.
2. Setting the API Key: stores the user's emergency-room API key in a variable named SERVICE_KEY for use in subsequent API calls.
3. API Call (Get Data): calls the Korean public API that provides emergency-room information and fetches the list of emergency rooms in Seoul.
4. Split Out: splits the emergency-room data into individual items for easier processing and storage.
5. Cast Type: a Code node that converts the value of the hvidate column from a number to a string.
6. Convert to File: converts the data to a file.
7. AWS S3 Upload: uploads the converted file to the specified AWS S3 bucket; the file name is based on the current date and time (yyyyMMddHH).
8. Connect to Google Sheets: uploads the fetched emergency-room data to Google Sheets: the “Clear Sheet” node first clears the existing data in the sheet, then the “Append Sheet” node appends the new rows.
9. Sticky Notes: notes explaining precautions and setup methods when using the workflow.
This workflow periodically fetches emergency room-related data, processes it, and automatically stores it in Google Sheets and AWS S3.
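The processing steps above (Split Out, Cast Type, and the S3 file naming) can be sketched as follows. The response shape and field names besides hvidate are illustrative assumptions.

```python
from datetime import datetime

# Hypothetical API response: a list of emergency-room records.
response = {
    "items": [
        {"dutyName": "A Hospital", "hvidate": 2024010112},
        {"dutyName": "B Hospital", "hvidate": 2024010113},
    ]
}

# Split Out: break the response into individual records.
records = response["items"]

# Cast Type: convert hvidate from a number to a string so downstream
# sinks (Sheets, S3 files) treat it as text rather than an integer.
for rec in records:
    rec["hvidate"] = str(rec["hvidate"])

# AWS S3 Upload: name the file from the current date and time (yyyyMMddHH).
file_name = datetime.now().strftime("%Y%m%d%H") + ".csv"
```

In the actual workflow the cast runs inside the Code node and the file name is set on the S3 Upload node; this sketch only mirrors the logic.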