Create, Delete, or Update Tens of Thousands of records at a time in Flow with the new Mass Transactor Action!
Ladies and gentlemen, I present to you what might be one of my favorite new actions: The Mass Transactor Action!
We at CapTech Consulting looked for the most impactful action for both our clients and the community, and decided it would be extremely powerful to enable Flow to run Creations, Updates, and Deletes of records asynchronously – AND specify a batch size.
Using this new invocable you will be able to transact records with little concern for limits – so long as you set your batch size correctly. Batch processing is specifically designed for large volumes of transactions – think of it as a ‘set it and forget it’ method of doing DML in Salesforce.
This action has been shown to create well over 10,000 records in a single Flow. That same flow, using the standard synchronous Create Records element, failed at 300 records. That's more than 33 times the transactional capacity of a typical DML element in Flow!
What’s ‘Asynchronous’?
tl;dr (too long, didn’t read) version: Async processing increases your limits. Use it when you want to do stuff to a lot of stuff and you don’t care exactly when the stuff gets done.
Most transactions you work with in Salesforce are synchronous. That means that when you do something to trigger the transaction, you have to wait until the operation is done. Asynchronous transactions work the opposite way: you send the work off into the dark and hope everything works out okay. You can still determine whether it succeeded, but it takes a little more work – you query the AsyncApexJob object using the returned JobId.
I highly recommend this Trailhead on asynchronous processing (promise it isn’t bad): https://trailhead.salesforce.com/en/content/learn/modules/asynchronous_apex/async_apex_introduction
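If you’re curious what one of these jobs looks like under the hood, here’s a minimal, generic sketch of an Apex batch class. To be clear, this is purely illustrative – it is not the Mass Transactor’s source code, and every name in it is made up:

```apex
// Generic illustration of Database.Batchable - NOT the Mass Transactor's actual code.
// The framework slices the collection into 'scopes'; each execute() call runs as its
// own transaction with its own set of governor limits.
global class ExampleCollectionUpdateBatch implements Database.Batchable<sObject> {
    private List<sObject> records;

    global ExampleCollectionUpdateBatch(List<sObject> records) {
        this.records = records;
    }

    global Iterable<sObject> start(Database.BatchableContext bc) {
        return records;  // hand the whole collection to the framework to split into batches
    }

    global void execute(Database.BatchableContext bc, List<sObject> scope) {
        update scope;    // one scope (e.g. 200 records) = one transaction
    }

    global void finish(Database.BatchableContext bc) {
        // runs once after all batches complete - a natural spot for a notification email
    }
}
```

Submitting the job returns immediately with a Job Id; the actual DML happens later, batch by batch:

```apex
Id jobId = Database.executeBatch(new ExampleCollectionUpdateBatch(recordsToProcess), 200); // recordsToProcess is any List<sObject>
```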
How it Works
- Pass this action a collection of records you want to create, update, or delete
- Decide if you want to change the default Batch Size, which is 200. This means Salesforce will bucket 200 records into each transaction. If you want to process 5,000 records and your batch size is 200, Salesforce will effectively create 25 separate transactions for you – each with its own fresh set of asynchronous governor limits (double the synchronous SOQL query limit, for example)! For ‘busy’ objects with a ton of code or automation, you’ll want to lower this if you have issues at 200.
- Run the flow
- If you want to track the progress of your operation, make note of the returned JobId. You can check it on the Apex Jobs page in Setup, or do a Get/Lookup in Flow on the AsyncApexJob object using that JobId and show the returned Job record’s fields on a screen for the user (example query below).
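For reference, that Flow ‘Get’ is equivalent to a query like this one (the jobId value here is just a placeholder for whatever the action returns):

```apex
// Anonymous Apex equivalent of a Flow Get Records on AsyncApexJob.
String jobId = '707...';  // replace with the Batch JobId output by the action
AsyncApexJob job = [
    SELECT Status, JobItemsProcessed, TotalJobItems, NumberOfErrors, ExtendedStatus
    FROM AsyncApexJob
    WHERE Id = :jobId
    LIMIT 1
];
System.debug(job.Status + ': ' + job.JobItemsProcessed + ' of ' + job.TotalJobItems + ' batches processed');
```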
Major Features of the Invocable
- Create, Update, or Delete a record collection asynchronously from Flow and double your governor limits
- Creates a batch job whose progress you can monitor from the Apex Jobs page in Setup, or by querying the AsyncApexJob object with the returned JobId. You can then display the status on a Flow screen using the results of a ‘Get’ on AsyncApexJob, with the JobId as your identifier:

Real World Walkthrough
Stay tuned for a walkthrough of how this could be used to process THOUSANDS of records.
Batch Management & Notes
- Use in Triggers / Processes – Be very careful when using this in any Process Builder-launched flows, triggers, or after-update flows. Batch jobs are not meant to be triggered hundreds of times.
- Batch Size – If your input collection is 2,000 records and your batch size is 500, the action will create 4 batches. Each batch is treated as its own transaction by Salesforce. If in doubt, leave the batch size input blank – it will default to 200.
- Batch Size Management – Lower the batch size if you have a lot of code/automations and need to ensure you don’t hit Apex limits (see the sketch after this list). The only way of truly knowing the safe number is by testing – 200 is a great starting point. Keep in mind that a lower number means the job will take longer to complete, but reduces the risk of hitting limits.
- For Creates – Getting the record Ids created by the job is not currently possible from within Flow without some extra work.
- Job Limits – An org can only have so many batch jobs running at once – your job may be queued and/or put in the ‘Apex Flex Queue’ for processing – https://developer.salesforce.com/docs/atlas.en-us.224.0.apexcode.meta/apexcode/apex_batch_interface.htm
- Empty / Null Checks – Ensure you’re doing a null/empty check on whatever collection you pass in before running the action! There’s no use creating empty jobs to clog up the Apex Jobs list.
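Here’s a rough sketch of what the batch size and the empty check amount to, reusing the illustrative ExampleCollectionUpdateBatch class from earlier (again, not the package’s actual code): a 2,000-record collection submitted with a batch size of 500 becomes 2000 / 500 = 4 separate transactions, each starting with fresh governor limits.

```apex
List<Account> inputCollection = [SELECT Id FROM Account LIMIT 2000];

// Empty/null check first - don't submit a job that has nothing to do.
if (inputCollection != null && !inputCollection.isEmpty()) {
    // With a batch size of 500, these 2,000 records run as 4 separate transactions.
    Id jobId = Database.executeBatch(new ExampleCollectionUpdateBatch(inputCollection), 500);
}
```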
Inputs
- Operation Type (Text) Valid options are Create, Update, or Delete
- Input Collection (Record Collection) Can be any sObject/Record collection from Flow
- Note that if you want to Update or Delete records, an Id will need to be present in the collection. It doesn’t support specifying an External Id. Yet.
- Batch Size (Optional) (Number) Specify how many records you want Salesforce to process at once.
- Ranges from 1 – 2000. Default is 200.
- THERE IS A MAX OF 2,000 RECORDS PER BATCH! This is a Salesforce-imposed limit, and the action will throw an error if you try to go above it.
- This is where the magic happens – lower this number if you have a ton of code/automations and you need to ensure you don’t hit Apex Limits.
- Finish Notification Email Address (Optional) (Text) A single email address to send the finish email to
- Body of the email (Optional) (Text) What the email will contain upon completion
- Subject of the email (Optional) (Text) Subject of the email upon completion
Outputs
- Success True or False (Boolean/Checkbox) Set to true if the batch was successfully submitted to the queue
- Batch JobId (Text) The Job ID created for this batch – Use this to query the AsyncApexJob object in Flow to get the latest results. Check out https://developer.salesforce.com/docs/atlas.en-us.api.meta/api/sforce_api_objects_asyncapexjob.htm for a rundown of the fields available.
- Error Message (Text) If the job wasn’t created successfully, this will be populated with the error
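To make the inputs and outputs above a bit more concrete, here’s a hypothetical sketch of how they might be shaped as an invocable Apex action. The class and variable names below are invented for illustration (and the email subject/body inputs are omitted for brevity) – see the source repo linked below for the real implementation:

```apex
global with sharing class MassTransactorSketch {

    global class Request {
        @InvocableVariable(label='Operation Type' required=true)    // 'Create', 'Update', or 'Delete'
        global String operationType;

        @InvocableVariable(label='Input Collection' required=true)  // any record collection from Flow
        global List<SObject> inputCollection;

        @InvocableVariable(label='Batch Size')                      // 1-2000; defaults to 200
        global Integer batchSize;

        @InvocableVariable(label='Finish Notification Email Address')
        global String notificationEmailAddress;
    }

    global class Result {
        @InvocableVariable global Boolean success;      // true if the batch was submitted to the queue
        @InvocableVariable global String jobId;         // use this to query AsyncApexJob for status
        @InvocableVariable global String errorMessage;  // populated if submission failed
    }

    @InvocableMethod(label='Mass Transactor (sketch)')
    global static List<Result> run(List<Request> requests) {
        // ...validate the operation type, hand the collection to Database.executeBatch with the
        //    requested batch size, and return the resulting Job Id...
        return new List<Result>();
    }
}
```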
Download / Source
NOTE: Keep in mind this is an UNMANAGED package, which means it cannot be upgraded in the future. I would suggest installing it in a sandbox and deploying the Apex classes together when migrating to production.
Unmanaged package (contains 4 Apex Classes): https://test.salesforce.com/packaging/installPackage.apexp?p0=04t3h000004m6N7
Unlocked Package Links:
production: https://login.salesforce.com/packaging/installPackage.apexp?p0=04t3h000004m6NHAAY
sandbox: https://test.salesforce.com/packaging/installPackage.apexp?p0=04t3h000004m6NHAAY
Source:
Now available in the FlowComponents repo at: https://github.com/alexed1/LightningFlowComponents/tree/master/flow_action_components/MassTransactor
Future Improvement Ideas
- We’d love to build in more details about the success of the job – potentially by tying this to a custom platform event that outputs a string of created Ids, or an event that a Flow can listen for and act on.
- Provide for a success/failure email using either an email template ID or a flow text template
- Build in Upsert functionality
- Create a separate action that allows you to cancel a job if needed
- Provide a SOQL query to drive your Update/Delete – this is handy if your collection would exceed the SOQL limit of 50,000 records.