
Acumatica Bulk Data Processing

February 3, 2024

Introduction

Bulk data processing is essential for migrating data, performing mass updates, and integrating Acumatica with external systems. This guide covers various methods for handling large datasets efficiently in Acumatica ERP.

Import Methods

Acumatica provides multiple ways to import bulk data:

  • Data Import Export - Built-in wizard for CSV imports
  • REST API - Programmatic bulk operations
  • SQL Direct - Direct database writes (unsupported by Acumatica; use only with extreme caution)
  • Integration Scenarios - Using Acumatica's integration tools

REST API Bulk Operations

Using the REST API for bulk operations provides the most flexibility:

// Create customers via the contract-based REST API.
// The contract-based endpoint accepts one top-level record per request,
// so bulk loads loop client-side, issuing one PUT per record.
// (The endpoint version segment, 22.200.001 here, varies by instance.)
PUT /entity/Default/22.200.001/Customer
Content-Type: application/json

{
  "CustomerID": { "value": "CUST001" },
  "CustomerName": { "value": "Customer One" },
  "CreditLimit": { "value": 10000 }
}
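Building the `{"value": ...}` envelopes by hand for every field and record is error-prone. A small helper can wrap plain records and split them into batches before sending. A minimal Python sketch (the field names, batch size, and the commented-out endpoint URL are illustrative, not part of any official client):

```python
import json

def to_acumatica_payload(record):
    """Wrap each plain field value in Acumatica's {"value": ...} envelope."""
    return {field: {"value": value} for field, value in record.items()}

def chunk(records, size):
    """Split a record list into batches of at most `size` items."""
    for start in range(0, len(records), size):
        yield records[start:start + size]

customers = [
    {"CustomerID": "CUST001", "CustomerName": "Customer One", "CreditLimit": 10000},
    {"CustomerID": "CUST002", "CustomerName": "Customer Two", "CreditLimit": 15000},
]

for batch in chunk(customers, 1000):
    for record in batch:
        body = json.dumps(to_acumatica_payload(record))
        # Send each serialized payload to the API here, e.g.:
        # session.put(f"{base}/entity/Default/22.200.001/Customer", data=body)
```

Batching here mainly bounds memory and lets you checkpoint progress between groups of requests; the wire format per record is unchanged.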

Batch Processing

For processing large datasets, implement batch processing in your customizations:

public class BulkProcessHandler
{
    private const int BatchSize = 100;
    
    public void ProcessBulkUpdate(List<CustomerData> customers)
    {
        int processed = 0;
        var batch = new List<CustomerData>();
        
        foreach (var customer in customers)
        {
            batch.Add(customer);
            
            if (batch.Count >= BatchSize)
            {
                ProcessBatch(batch);
                processed += batch.Count;
                batch.Clear();
                
                // Report progress here (e.g. "{processed} of {customers.Count}")
                // via PXProcessing<T>.SetCurrentItem or PXLongOperation when
                // running inside a processing graph
            }
        }
        
        // Process the remaining partial batch
        if (batch.Count > 0)
        {
            ProcessBatch(batch);
            processed += batch.Count;
        }
    }
    
    private void ProcessBatch(List<CustomerData> batch)
    {
        // Process each batch within a transaction
        using (var ts = new PXTransactionScope())
        {
            foreach (var data in batch)
            {
                UpdateCustomer(data);
            }
            ts.Complete();
        }
    }
}
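The chunk-and-flush pattern above is language-neutral, and in practice you usually want each batch isolated so one bad batch is recorded rather than aborting the whole run. A Python sketch of that shape (`process_batch` stands in for whatever per-batch work you do, such as the transactional update shown above):

```python
def process_in_batches(items, batch_size, process_batch):
    """Flush items in fixed-size batches; a failing batch is recorded, not fatal."""
    processed, failed_batches = 0, []
    batch = []
    for item in items:
        batch.append(item)
        if len(batch) >= batch_size:
            try:
                process_batch(batch)
                processed += len(batch)
            except Exception:
                failed_batches.append(list(batch))
            batch = []          # start a fresh batch; flushed list is kept intact
    if batch:                   # flush the remaining partial batch
        try:
            process_batch(batch)
            processed += len(batch)
        except Exception:
            failed_batches.append(list(batch))
    return processed, failed_batches
```

Returning the failed batches lets the caller log them or retry them separately, which pairs naturally with per-batch transactions: a rollback costs at most one batch of work.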

Code Examples

Here's a fuller example using a client wrapper around the screen-based API's Command/Value types:

class AcumaticaBulkProcessor
{
    private readonly AcumaticaApi _api;

    public AcumaticaBulkProcessor(AcumaticaApi api)
    {
        _api = api;
    }

    public async Task BulkImportInventory(List<InventoryItem> items)
    {
        const int batchSize = 50;
        var batches = items
            .Select((item, index) => new { item, index })
            .GroupBy(x => x.index / batchSize)
            .Select(g => g.Select(x => x.item).ToList());
        
        foreach (var batch in batches)
        {
            var commands = batch.Select(item => new Command[]
            {
                new Value { Value = item.InventoryID, LinkedCommand = InventoryItem.InventoryID },
                new Value { Value = item.Descr, LinkedCommand = InventoryItem.Descr },
                new Value { Value = item.ItemStatus, LinkedCommand = InventoryItem.ItemStatus }
            }).ToArray();
            
            await _api.Submit(commands);
        }
    }
}

Best Practices

  • Use Batches - Process data in manageable chunks
  • Implement Logging - Track successful and failed records
  • Validate First - Validate data before processing
  • Use Transactions - Wrap operations in transactions for consistency
  • Monitor Performance - Track processing time and resource usage
  • Test Thoroughly - Always test with sample data first
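Several of these practices compose naturally: validate records up front, send only the valid ones, and log every success, failure, and rejection. A hedged Python sketch of that flow (the validation rule, logger name, and `send` callable are placeholders, not part of any Acumatica API):

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("bulk_import")

def validate(record):
    """Placeholder rule: require a non-empty CustomerID and a non-negative limit."""
    return bool(record.get("CustomerID")) and record.get("CreditLimit", 0) >= 0

def import_with_logging(records, send):
    """Validate first, then send record-by-record, tracking every outcome."""
    valid = [r for r in records if validate(r)]
    rejected = [r for r in records if not validate(r)]
    for r in rejected:
        log.warning("Rejected before send: %r", r)
    succeeded, failed = [], []
    for r in valid:
        try:
            send(r)                      # e.g. an HTTP PUT per record
            succeeded.append(r)
        except Exception as exc:
            log.error("Failed %s: %s", r.get("CustomerID"), exc)
            failed.append(r)
    log.info("Imported %d, failed %d, rejected %d",
             len(succeeded), len(failed), len(rejected))
    return succeeded, failed, rejected
```

Keeping the three outcome lists separate makes reruns cheap: rejected records go back to the data owner for correction, while failed records can simply be retried.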

Summary

Bulk data processing in Acumatica can be accomplished through various methods depending on your requirements. The REST API provides the most flexibility for programmatic solutions.

For more information, see our guides on Data Import/Export and REST API Integration.