Vibhor Goel wrote an excellent post on using Visualforce to keep record data displayed in Flows in-sync with any changes made on a Lightning Record Page. He shows you how you can automatically update and refresh a Screen Flow when a record is updated. Check out the article here: https://www.accidentalcodersf.com/2020/09/keep-screen-flow-in-sync-with-record-page.html.
We have been posting about great things you can do with Flow and Lightning for quite some time now. When you click on Flow from the Home Page and then look at what is available to extend Flows, you can view and select from a number of different Flow Screen Components, Flow Actions, and Process Components. We also have selections for Next Best Action, Lightning Page Components, and External Services.
What was missing was a place to show you some of the great Applications and Utilities that you, the Administrator, can download and use to help you develop in and support your own orgs. Check out this new section often to see what might be there that you can put to use. Let us know in the comments about other Apps and Utilities that you would like to see highlighted on this page.
Many of the components on this site get updated on a regular basis with new enhancements. All you should need to do is follow the installation instructions to bring the component version up to date in your own orgs. Unfortunately, the methods and versions of the packaging tools we’ve used to produce these updated releases have changed over time. Things have finally settled down a bit and we are trying to be much more consistent in how the updates get packaged.
If you do run into issues, there are a few things you can try. UnofficialSF.com contributor Jack Pond wrote this article a while back that may do the trick for you.
Another possible issue pops up when you have already created Flows that use the component you are trying to upgrade. Here’s an error you might see when trying to upgrade from v2.0 of the QuickChoice component:
This means you will have to uninstall the current version before you can install the new version. That's fine if you haven't already created Flows that use the component. If you have, you might get a message like this: “Unable to uninstall package,” followed by one or more “Component is in use by another component in your organization. My_Flowname” errors.
If you are seeing this, you will need to first Delete any Flows that already use the component you are trying to update. Rather than having to recreate them from scratch, you can temporarily Export them then Import them back in after you have installed the updated component. Do this by first installing my Import/Export Flow Utility Flow.
The Import/Export Utility works with the most recent or active version of the Flow. You might also find that you have a number of older, inactive versions of the Flow that you have to delete as well. This can be a time-consuming manual process that can be avoided if you install my Flow and Process Builder List View with Batch Delete App.
It will be a fairly painless process once you have the right helper utilities installed in your org.
- Export your affected Flows
- Delete the inactive versions of the Flows
- Uninstall the old Component
- Install the new Component
- Import your saved Flows
There have been a lot of changes to what you have to do to let Guest Flows read and write data. Here’s a guide:
One thing I really like to see: Flow Components with price tags. That typically means customer support and solid, deep functionality. A recently available ‘pro’ component for your consideration: AddressTools from ProvenWorks. They already had a potent Lightning component that checks the validity of addresses. They’ve now exposed that functionality via two new Flow Screen Components:
A really nice use case from Vibhor Goel: he pointed out that the common situation where a delete is prevented by related records can be addressed with a Before Delete flow like this one:
DetectAndLaunch has been updated to include an option to launch screen flows modally. It also has the ability to launch different flows depending on whether the record has been edited or deleted.
Import and Export Flows between Salesforce Orgs
Have you ever wanted to copy or move a Flow or Process Builder from one org to another without having to create a Change Set or rebuild it from scratch?
How about seeing a great Flow that you or someone else has created and would like to share?
Install this Flow in your org if you would like to Export or Import Flows and Process Builders.
Export a Flow
- Select Setup > Process Automation > Flows
- Open Import/Export Flows
- Run the Flow
- Select Export, choose your Flow and click Next
- You will see the export status while the Flow is being transferred
- A success message will display once the Flow has been exported
- The exported Flow is saved as a Salesforce File linked to your User record. To see your Files, click the App Launcher (the waffle icon), type in Files, and select Files.
- To download and save your Flow file so it can be shared, select the drop-down arrow and choose Download.
Import a Flow
- Select Setup > Process Automation > Flows
- Select Import/Export Flows
- Run the Flow
- Select Import then click on Upload Files
- Pick the Flow file from your computer’s file dialog box. Hint: The file names for Flows and Process Builders will end with .flow-meta.xml
- After the file has been uploaded, select Done then click Next
- You will see the status while the Flow is being imported and then deployed to the current org
- A success message will display once the Flow has been imported and deployed
NOTE: Objects, fields and referenced components must be available and compatible in the new org in order for the Flow to be deployed successfully.
9/3/20 – Eric Smith – Version 1.1
Updated the Flow Base Components (v1.2.6) to resolve an issue where some Flows would generate an error on Import or Export
9/1/20 – Eric Smith – Version 1.0
Ladies and gentlemen, I present to you what might be one of my favorite new actions: The Mass Transactor Action!
We at CapTech Consulting looked for the most impactful action for both our clients and the community, and decided it would be extremely powerful to enable Flow to run Creations, Updates, and Deletes of records asynchronously – AND specify a batch size.
Using this new invocable you will be able to transact records with little concern for limits – so long as you set your batch size correctly. Batch processing is specifically designed for large volumes of transactions – think of it as a ‘set it and forget it’ method of doing DML in Salesforce.
This action has been shown to create well over 10,000 records in a single Flow. That same flow, carrying out standard synchronous Create Record, failed at 300 records. That represents an increase of more than 33 times the transactional capacity of a typical DML action in Flow!
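Under the hood, actions like this are generally built on the Apex Database.Batchable interface. The sketch below is illustrative only – it is not the MassTransactor source, and the class and variable names are made up – but it shows the pattern the post describes: each execute() call runs as its own transaction with its own governor limits.

```apex
// Illustrative sketch only – not the actual MassTransactor implementation.
// Each call to execute() is a separate transaction with fresh governor limits.
public class RecordDmlBatch implements Database.Batchable<SObject> {
    private List<SObject> records;
    private String operation; // 'Create', 'Update', or 'Delete'

    public RecordDmlBatch(List<SObject> records, String operation) {
        this.records = records;
        this.operation = operation;
    }

    // Hand the full collection to the batch framework
    public Iterable<SObject> start(Database.BatchableContext bc) {
        return records;
    }

    // Called once per batch of up to the requested batch size
    public void execute(Database.BatchableContext bc, List<SObject> scope) {
        if (operation == 'Create') {
            insert scope;
        } else if (operation == 'Update') {
            update scope;
        } else if (operation == 'Delete') {
            delete scope;
        }
    }

    public void finish(Database.BatchableContext bc) {
        // A completion email could be sent here
    }
}
```

Submitting the job with a batch size of 200 returns the AsyncApexJob Id, e.g. `Id jobId = Database.executeBatch(new RecordDmlBatch(myRecords, 'Create'), 200);` – which is the same kind of JobId this action hands back to Flow.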
tl;dr (too long, didn’t read) version: Async processing increases your limits. Use it when you want to do stuff to a lot of stuff and you don’t care exactly when the stuff gets done.
Most of what you work with in Salesforce are synchronous transactions. That means that when you do something to trigger the transaction, you need to wait until the operation is done. Asynchronous transactions work the opposite way: you send the work off into the dark and hope everything works out okay. You can still determine whether it succeeded, but it takes a little more work – you query the AsyncApexJob object using the returned JobId.
I highly recommend this Trailhead on asynchronous processing (I promise it isn’t bad): https://trailhead.salesforce.com/en/content/learn/modules/asynchronous_apex/async_apex_introduction
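For example, checking on a job from Apex (the same lookup a Get Records element performs in Flow) might look like this – `jobId` here is assumed to be the Id the async operation handed back:

```apex
// Sketch: check the status of an async job using its returned JobId.
// These are standard fields on the AsyncApexJob object.
AsyncApexJob job = [
    SELECT Id, Status, JobItemsProcessed, TotalJobItems, NumberOfErrors
    FROM AsyncApexJob
    WHERE Id = :jobId
];
System.debug(job.Status); // e.g. 'Queued', 'Processing', or 'Completed'
```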
How it Works
- Pass this action a collection of records you want to create, update, or delete
- Decide if you want to change the default Batch Size, which is 200. This means that Salesforce will bucket 200 records into a transaction. If you want to process 5,000 records and your batch size is 200, Salesforce will effectively create 25 separate transactions for you – all with doubled SOQL/CPU limits! For ‘busy’ objects with a ton of code or automation, you’ll want to lower this if you have issues at 200.
- Run the flow
- If you want to track the progress of your operation, make note of the returned JobId. You can then either check the Apex Jobs page in Setup, or do a Get/Lookup in Flow on the AsyncApexJob object using the JobId and show the returned Job record’s fields on a screen for the user.
Major Features of the Invocable
- Create, Update, or Delete a record collection asynchronously from Flow and double your governor limits
- Creates a Batch job that lets you monitor its progress from the Apex Jobs setup page, or lets you query the AsyncApexJob object using the returned JobId. You can then display the status on a flow screen using the results of a ‘Get’ on the AsyncApexJob object with the JobId as your identifier:
Real World Walkthrough
Stay tuned for a walkthrough of how this could be used to process THOUSANDS of records.
Batch Management & Notes
- Use in Triggers / Processes – Be very careful when using this in any process builder launched flows, triggers, or after-update flows. Batch jobs are not meant to be triggered hundreds of times.
- Batch Size – If your input collection is 2,000 records and your batch size is 500, it will create 4 batches. Each batch is considered a transaction to Salesforce. If in doubt, leave the batch size input blank – it will default to 200.
- Batch Size Management – Lower the batch size if you have a lot of code/automations and you need to ensure you don’t hit Apex limits. The only way to truly know the safe number is by testing – 200 is a great starting point. Keep in mind that a lower number means the job will take longer to complete, but it will ensure you don’t hit limits.
- For Creates – Getting the record Ids created by the job is not currently possible from within Flow without some extra work.
- Job Limits – An org can only have so many batches going at once – your job will be queued and/or put in the ‘Apex Flex Queue’ for processing – https://developer.salesforce.com/docs/atlas.en-us.224.0.apexcode.meta/apexcode/apex_batch_interface.htm
- Empty / Null Checks – Ensure you’re doing a null/empty check on whatever collection you pass in before running the action! There’s no use creating empty jobs to clog up the Apex Jobs list.
- Operation Type (Text) Valid options are Create, Update, or Delete
- Input Collection (Record Collection) Can be any sObject/Record collection from Flow
- Note that if you want to Update or Delete records, an Id will need to be present in the collection. It doesn’t support specifying an External Id. Yet.
- Batch Size (Optional) (Number) Specify how many records you want Salesforce to process at once.
- Ranges from 1 – 2000. Default is 200.
- THERE IS A MAX OF 2,000 RECORDS PER BATCH! This is a Salesforce imposed limit and this will throw an error if you try and go above it.
- This is where the magic happens – lower this number if you have a ton of code/automations and you need to ensure you don’t hit Apex Limits.
- Finish Notification Email Address (Optional) (Text) A single email address to send the finish email to
- Body of the email (Optional) (Text) What the email will contain upon completion
- Subject of the email (Optional) (Text) Subject of the email upon completion
- Success True or False (Boolean/Checkbox) Set to true if the batch was successfully submitted to the queue
- Batch JobId (Text) The Job ID created for this batch – Use this to query the AsyncApexJob object in Flow to get the latest results. Check out https://developer.salesforce.com/docs/atlas.en-us.api.meta/api/sforce_api_objects_asyncapexjob.htm for a rundown of the fields available.
- Error Message (Text) If the job didn’t create successfully this will be populated
Download / Source
Unmanaged package (contains 4 Apex Classes): https://test.salesforce.com/packaging/installPackage.apexp?p0=04t3h000004m6N7
Unlocked Package Link: https://test.salesforce.com/packaging/installPackage.apexp?p0=04t3h000004m6NHAAY
Now available in the FlowComponents repo at: https://github.com/alexed1/LightningFlowComponents/tree/master/flow_action_components/MassTransactor
Future Improvement Ideas
- We’d love to build in more details about the success of the Job – potentially tying this with a custom platform event that will output a string of IDs created or an event that a Flow can listen for and act on.
- Provide for a success/failure email using either an email template ID or a flow text template
- Building in Upsert functionality
- Create a separate action that allows you to cancel a job if needed
- Provide a SOQL query to do your Update/Delete – this is handy if your collection exceeds the SOQL limit of 50,000 records.
The DateMatcher action lets you achieve more precise scheduling of your Schedule-Triggered Flows, enabling you to run Flows that activate on ‘the third Wednesday of each month’ or ‘the 23rd of August, each year’.
Here’s how you configure it:
Note the use of a Custom Property Editor.
The action compares the current date/time to the configuration you’ve provided, and simply returns a true or false based on whether the current day is a match. So the way to use Date Matcher is to create a Schedule-Triggered flow that triggers every day, and then have it always run DateMatcher to see if the current day matches your criteria:
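For illustration, here is roughly the kind of check a ‘third Wednesday of each month’ rule involves – a hedged sketch of the date logic, not the DateMatcher component’s actual source:

```apex
// Sketch only: does today match 'the third Wednesday of the month'?
Date today = Date.today();
Datetime now = Datetime.now();

// format('EEEE') returns the full weekday name, e.g. 'Wednesday'
Boolean isWednesday = now.format('EEEE') == 'Wednesday';

// Days 1-7 hold the first occurrence of any weekday, 8-14 the second,
// and 15-21 the third.
Integer dayOfMonth = today.day();
Boolean isThirdOccurrence = dayOfMonth >= 15 && dayOfMonth <= 21;

Boolean isMatch = isWednesday && isThirdOccurrence;
```

In the Flow itself, this kind of result is what DateMatcher returns: a simple true/false your daily Schedule-Triggered Flow can branch on.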
Version 1.0.0 Unlocked 8/30/20