Import Large Tab-Delimited Data to SharePoint with Power Automate Without a Premium License
Table of Contents
- Introduction
- Why Tab-Delimited Files?
- Step-by-Step Guide
- Tips & Troubleshooting
- Outcome Comparison
- Conclusion
- References
Introduction
Importing large datasets into SharePoint lists can be tricky, especially when dealing with CSV or tab-delimited files. This guide walks you through a reliable Power Automate approach, highlights common pitfalls, and shares practical tips for handling big data without premium connectors.
Why Tab-Delimited Files?
CSV files often contain commas and new lines within data, making parsing complex. Tab-delimited files (.txt) are more reliable for bulk imports, as splitting by tab (\t) produces consistent columns.
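For example (illustrative data only), a CSV row like
Product A,"High-quality, durable item",19.99
splits into four pieces on a naive comma split because of the comma inside the quoted description, while the same row saved as tab-delimited text always splits into exactly three columns, since tab characters rarely appear inside the values themselves.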
Step-by-Step Guide
1. Prepare Your Data
- Download sample data from Datablist. In this post I chose the 10k CSV from Google Drive.
- Save the file as tab-delimited .txt from Excel for best results.

2. Build the Flow for Tab-Delimited Files
Trigger:
- Use “When a file is created (properties only)” in SharePoint to monitor uploads.
Scope: Create_SharePoint_Items
- Group all actions related to processing the new file and creating SharePoint items.
Settings (Parse JSON):
- Initialize settings like site URL, list name, batch size, and OData metadata level.
Content:
{
  "tenantBaseUrl": "4g6zf4.sharepoint.com",
  "sitePath": "sites/Products",
  "listName": "Products1",
  "batchSize": 1000,
  "oDataLevel": "nometadata"
}
Schema:
{
  "type": "object",
  "properties": {
    "tenantBaseUrl": {
      "type": "string",
      "title": "Base URL of SharePoint Site",
      "description": "Base URL of SharePoint Site"
    },
    "sitePath": {
      "type": "string",
      "title": "Relative path of the site"
    },
    "listName": {
      "type": "string",
      "title": "SharePoint List",
      "description": "List where items are to be batch created"
    },
    "batchSize": {
      "type": "integer",
      "title": "Batch Size",
      "description": "Number of items to send in a single batch"
    },
    "oDataLevel": {
      "type": "string",
      "title": "OData Level",
      "description": "verbose, minimalmetadata or nometadata"
    }
  }
}
Get_file_content:
- Retrieves the content of the newly created file using the settings.
- Site Address:
concat('https://',body('Settings')?['tenantBaseUrl'],'/',body('Settings')?['sitePath'])
- File Identifier:
triggerOutputs()?['body/{Identifier}']
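With the sample settings above, the Site Address expression resolves to https://4g6zf4.sharepoint.com/sites/Products.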

GenerateSPData (Select):
- From:
skip(split(trim(body('Get_file_content')), decodeUriComponent('%0A')), 1)
- Map fields:
- Title:
split(item(), decodeUriComponent('%09'))[1]
- Description:
split(item(), decodeUriComponent('%09'))[2]
What this configuration does:
From expression breakdown:
- trim(body('Get_file_content')) - Removes leading/trailing whitespace from the file content
- split(..., decodeUriComponent('%0A')) - Splits the text by newlines (%0A = line break character)
- skip(..., 1) - Skips the first row (header row) and processes only data rows
Field mapping explanation:
- split(item(), decodeUriComponent('%09')) - Splits each row by tab character (%09 = tab delimiter)
- [1] and [2] - Extract the second and third columns (zero-indexed) for Title and Description
Example: If your tab-delimited file has:
ID Title Description Category
001 Product A High-quality item Electronics
002 Product B Budget-friendly option Home
The expression will:
- Skip the header row “ID Title Description Category”
- Split “001 Product A High-quality item Electronics” by tabs
- Map “Product A” (index 1) to Title and “High-quality item” (index 2) to Description
Why tab-delimited is better: Unlike CSV files that can break when data contains commas, tab-delimited files use \t (tab) as separator, making column parsing more reliable.
Processes the file content, splitting it into fields such as Title, Description, Brand, Category, Price, Currency, Color, and Size.
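With the two mappings above, the GenerateSPData output for the sample rows would be an array of objects ready for item creation, roughly:
[
  { "Title": "Product A", "Description": "High-quality item" },
  { "Title": "Product B", "Description": "Budget-friendly option" }
]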
Apply_to_each (Foreach):
- Select output:
chunk(body('GenerateSPData'), body('Settings')?['batchSize'])
- Loops through the data in chunks (batches) based on the batch size; see the worked example after the table below.
Apply to each can handle a maximum of 5,000 iterations on a non-premium license, so chunking is essential for handling more than 5k items.

| License Tier | Apply to Each Limit |
|---|---|
| Low (e.g., M365, Free, Trial) | 5,000 items |
| All Others (Premium, Per Flow, Process) | 100,000 items |
See the official documentation for the latest details: Limits of automated, scheduled, and instant flows
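As a quick illustration of chunking (hypothetical numbers): chunk(range(1, 7), 3) returns [[1,2,3],[4,5,6],[7]], so 10,000 mapped rows with a batchSize of 1,000 become 10 batches and the outer loop only runs 10 times.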
Apply_to_each_1 (Foreach):
- Select output:
item()
- Loops through each item in the batch.
Create_item:
- Use “Create item” (SharePoint) to add each row to your SharePoint List.

- Site Address:
@{concat('https://',body('Settings')?['tenantBaseUrl'],'/',body('Settings')?['sitePath'])}
- List Name:
body('Settings')?['listName']
- Title:
@{item()?['Title']}
- Description:
@{item()?['Description']}
Alternative: Import from Excel Table
- For structured data, use Excel tables and the List rows present in a table action. See more: Read Rows in Excel Using Power Automate with ‘Create Table’ Action.
- Make sure to enable pagination and set the threshold.

- If you don’t have a premium license, the maximum threshold is 5k; otherwise, you can set it much higher.
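To enable pagination, open the action's Settings, switch Pagination on, and set the Threshold to the number of rows you expect (e.g., 5000 on a non-premium license).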

Alternative: CSV Import
If your data does not contain commas inside the values, CSV can be an alternative; see Bulk Import CSV to SharePoint with Power Automate.
The advantage over Excel is that it can handle more than 5k records without a Premium license. The drawback is that commas inside values offset the columns and the import fails, as in the error below.

Alternative: Batch Processing with _api/$batch
To import large datasets efficiently, use SharePoint's batch API. The steps below are adapted from Paul Murana's blog post Power Automate: Batch Create SharePoint Items. Here's how to set up batch processing in your flow:
Add Batch Template (Compose action):
--changeset_c1
Content-Type: application/http
Content-Transfer-Encoding: binary

POST /@{body('Settings')?['sitePath']}/_api/web/lists/getByTitle('@{body('Settings')?['listName']}')/items HTTP/1.1
Content-Type: application/json;odata=@{body('Settings')?['oDataLevel']}
Prefer: return=minimal

|RowData|
Chunk Data for Batching (same pattern as above):
Add an Apply to each action using:
chunk(body('GenerateSPData'), body('Settings')?['batchSize'])
Within the loop, add a Select action (labelled PrepBatch) and set the following properties:
- From:
item()
- Map:
replace(outputs('batchTemplate'), '|RowData|', string(item()))
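Assuming a mapped row like {"Title":"Product A","Description":"High-quality item"}, each element of the PrepBatch output is the template with |RowData| replaced by that row's JSON, roughly:
--changeset_c1
Content-Type: application/http
Content-Transfer-Encoding: binary

POST /sites/Products/_api/web/lists/getByTitle('Products1')/items HTTP/1.1
Content-Type: application/json;odata=nometadata
Prefer: return=minimal

{"Title":"Product A","Description":"High-quality item"}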
Add a Send an HTTP request to SharePoint action:
Set the properties to:
- Site Address:
@{concat('https://',body('Settings')?['tenantBaseUrl'],'/',body('Settings')?['sitePath'])}
- Method: POST
- Uri:
/_api/$batch
- Headers:
- X-RequestDigest: digest
- Content-Type: multipart/mixed;boundary=batch_b1
- Body:
--batch_b1
Content-Type: multipart/mixed; boundary="changeset_c1"
Content-Transfer-Encoding: binary

@{join(body('PrepBatch'), decodeUriComponent('%0A'))}

--changeset_c1--

--batch_b1--
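Note that the outer request can return 200 OK even when individual changesets fail; each row gets its own embedded HTTP response inside the multipart batch response, so check the response body for 4xx/5xx status lines if counts don't add up.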

The whole batch flow looks like:

Gotchas with Batch
- If you import 10,000 records in batches of 100, you may end up with only ~7k-9k items created; the batch approach obfuscates individual failures.
Tips & Troubleshooting
- Use expressions like
split(item(), decodeUriComponent('%09'))?[0]
to access columns; '%09' is the URL-encoded tab character (\t).
- For large files, enable batching and pagination, and import in chunks (e.g., 1,000–5,000 rows per run). Without a premium license, Apply to each is capped at 5,000 iterations and fails for larger collections, so chunk first.
Outcome Comparison
| Method | Records Imported (out of 10,000) |
|---|---|
| Excel (batching) | 5,140 |
| CSV (batching) | 6,477 |
| Tab (batching) | 9,138 |
| Tab (no batching) | 10,000 |

Batching (see Power Automate: Batch Create SharePoint Items) often reports success even when individual items fail, and those failures are easy to miss unless you create items one by one. Errors are often caused by commas in values offsetting columns.

With batching, some of the payload data may be lost for various reasons, so data integrity is not guaranteed. For an example of similar behaviour with PnP PowerShell batching, see Update large list with PnP-Batch with retries to address throttling challenges.
Conclusion
This flow automates importing tab-delimited data into SharePoint, ensuring only valid rows are added and files are organized after processing. Batching improves performance, but always monitor for missing records in large imports.
References
- Power Automate: Batch Create SharePoint Items
- Read Rows in Excel Using Power Automate
- Update large list with PnP-Batch with retries to address throttling challenges
- Limits of automated, scheduled, and instant flows