Batchable Apex: The RevOps Playbook for Salesforce Automation

Replace unreliable Salesforce scheduled flows with Batchable Apex — governor limits, test classes, and a go-live checklist for RevOps teams.

When Should You Use Batch Apex Instead of Scheduled Flows?

Before writing code, we must understand the why and the what.

Implement this Batch Apex solution when clients experience any of the following:

Scheduled Flow Failures — Existing scheduled flows are failing, timing out, or proving unreliable. This is common when flows become complicated or touch a large volume of records. At RevBlack, we have outlawed scheduled flows and always use Batchable Apex instead.

High Volume Processing — Large volumes of records need to be processed on a schedule, which risks hitting Apex CPU or transaction limits in standard flows.

Static Records — Records need to be processed but don't naturally change on their own (e.g., formula updates or time-based maturity), meaning standard record-triggered flows won't fire.

Dealing with unreliable scheduled flows or Apex CPU timeouts in your Salesforce org?

BOOK A FREE CRM AUDIT

Which KPIs Does Batch Apex Impact?

Batch Apex projects should be connected to tangible business outcomes:

  • Automation Reliability — Replacing unreliable automations with a system that runs successfully every time
  • Operational Efficiency — Less time spent debugging or dealing with broken scheduled flows, reducing the cost of maintenance
  • System Uptime — Preventing system-wide slowdowns or lockouts caused by resource-heavy flows running during business hours
  • Data Consistency — Ensuring records are updated predictably, regardless of volume

What Questions Should You Ask Before Building?

Ask these questions to define the scope before building:

  • Logic and Frequency — What specific logic needs to run? How often should it run (daily, weekly)?
  • Volume — How many records could this touch during each run?
  • Current State — What record-triggered flows already exist on this object?
  • Exclusions — Are there any flows that should specifically not run when this batch updates the records?

What Is the RevBlack Standard for Batch Apex?

  • Simple — Keep the Apex code minimal. It should only handle the scheduling and the "trigger" update. Business logic remains in Flows.
  • Reliable — The system must handle large volumes without hitting governor limits.
  • Maintainable — Future admins should be able to update business logic in the Flow without needing to touch the Apex code.

What Are the Best Practices for Building Batch Apex?

Plan Before You Build. Audit existing flows: You must audit all other flows on the target object and confirm that their entry criteria allow only the intended records through, so they will not be triggered by each Batch Apex run.

Safeguards.

The "Trigger" Field: Create a specific field for this process. This field will be updated every time the Batch Apex runs and will be the trigger for record-triggered flows.

Recommendation: Do not use generic names like Updated__c, which might be confused with system fields. Use something distinct like Last_Batch_Update__c (Date). If the job may run more than once per day, use a Date/Time field stamped with Datetime.now() instead, so every run changes the value and fires the "Is Changed" criteria.

Safe Flow Entry Criteria: This ensures that only the target flow will be triggered by the batch update.

  • The Target Flow — Set entry criteria to fire only when Last_Batch_Update__c Is Changed
  • All Other Flows — All other flows on the target object should have strict entry criteria. For extra safety, add the condition Last_Batch_Update__c Is Changed equals False to those flows. This ensures that the update populating the trigger field will not trigger them.

Batch Size: If the flow logic is heavy (e.g., callouts or complex subflows), you may need to reduce the batch size (default is 200) to avoid CPU timeout errors.
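If you need to test a smaller chunk size ad hoc, the scope argument of Database.executeBatch controls how many records each execute() call receives. A minimal sketch, run from Execute Anonymous, using the BatchUpdate class built in Step 1 below:

```apex
// Process 50 records per execute() call instead of the default 200.
// Smaller chunks give the triggered Flow more CPU headroom per transaction.
Database.executeBatch(new BatchUpdate(), 50);
```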

How Do You Build the Batch Apex System Step by Step?

Step 1 — The Batch Apex Class (The Engine)

This class performs the heavy lifting. It queries the records and stamps the trigger field.

Use the Configuration Table below to customize the template code for your specific project.

  • Class Name — In the template: BatchUpdate. Replace with: [Object]BatchUpdate (e.g., InvoiceBatchUpdate)
  • Target Object — In the template: Account. Replace with: the API Name of the object you are automating (e.g., Invoices__c)
  • Trigger Field — In the template: Last_Batch_Update__c. Replace with: the API Name of the custom field you created
BatchUpdate.cls Batch Apex Class — The Engine
global class BatchUpdate implements Database.Batchable<sObject> {

  // 1. START: Collect the records to be processed
  global Database.QueryLocator start(Database.BatchableContext bc) {
    // Replace 'Account' with your Target Object API Name.
    // Add a WHERE clause here if only a subset of records should be processed.
    return Database.getQueryLocator([
      SELECT Id, Last_Batch_Update__c
      FROM Account
    ]);
  }

  // 2. EXECUTE: Process the records in chunks
  global void execute(Database.BatchableContext bc, List<Account> scope) {
    for (Account record : scope) {
      // This change triggers the Record-Triggered Flow
      record.Last_Batch_Update__c = Date.today();
    }
    if (!scope.isEmpty()) {
      // 'false' = Partial Success — one failure won't stop the rest
      Database.update(scope, false);
    }
  }

  // 3. FINISH: Post-processing (usually left empty)
  global void finish(Database.BatchableContext bc) { }
}

Step 2 — The Schedulable Class (The Timer)

You cannot schedule a Batch class directly. You need a separate Schedulable class to act as the trigger.

BatchUpdateScheduler.cls Schedulable Class — The Timer
global class BatchUpdateScheduler implements Schedulable {
  global void execute(SchedulableContext sc) {
    BatchUpdate batch = new BatchUpdate();
    // Reduce to 100 or 50 if you hit Apex CPU Time limits
    Database.executeBatch(batch, 200);
  }
}
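Once a class is referenced by a scheduled job, Salesforce blocks redeploying changes to it until the job is removed. A sketch of finding and aborting the job from Execute Anonymous (the job name 'Daily Invoice Batch Update' is an assumed example; use whatever name you chose when scheduling):

```apex
// Find the scheduled job by name and abort it so the class can be redeployed
for (CronTrigger ct : [
    SELECT Id, CronJobDetail.Name
    FROM CronTrigger
    WHERE CronJobDetail.Name = 'Daily Invoice Batch Update'
]) {
    System.abortJob(ct.Id);
}
```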

Step 3 — The Test Class (The Safety Net)

Requirement: You cannot deploy to Production without 75% code coverage. This test class ensures that your main Apex class runs as planned.

BatchUpdateTest.cls Test Class — The Safety Net
@isTest
private class BatchUpdateTest {
  @isTest
  static void testBatch() {
    // 1. SETUP: Create dummy data
    Account testRecord = new Account(Name = 'Test Account');
    insert testRecord;

    // 2. EXECUTE: Run the batch directly so it finishes inside the test.
    // A batch enqueued by a scheduled job may not execute before Test.stopTest(),
    // which would leave the trigger field blank and fail the assertion.
    Test.startTest();
    Database.executeBatch(new BatchUpdate(), 200);
    Test.stopTest();

    // 3. ASSERT: Verify the field was updated
    Account updated = [SELECT Last_Batch_Update__c FROM Account WHERE Id = :testRecord.Id];
    System.assertEquals(Date.today(), updated.Last_Batch_Update__c,
      'The trigger field should be updated to today.');
  }

  @isTest
  static void testScheduler() {
    // Cover the Schedulable class: confirm the job can be scheduled
    Test.startTest();
    String cronExp = '0 0 0 15 3 ? 2099';
    Id jobId = System.schedule('Test Batch Update', cronExp, new BatchUpdateScheduler());
    Test.stopTest();
    System.assertNotEquals(null, jobId, 'The scheduled job should be created.');
  }
}

Step 4 — Scheduling the Job

Once the code is deployed to Production, you must manually schedule it to run.

  1. Navigate to Setup
  2. In the Quick Find box, type Apex Classes
  3. Click the Schedule Apex button
  4. Job Name — Enter a descriptive name (e.g., Daily Invoice Batch Update)
  5. Apex Class — Click the lookup (magnifying glass) and select BatchUpdateScheduler
  6. Frequency:
    • Select Weekly
    • Check the boxes for every day of the week (Sunday–Saturday) to run daily
  7. Start Time — Select the preferred time (e.g., 1:00 AM) to minimize impact on business hours
  8. Click Save
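Alternatively, the job can be scheduled from Execute Anonymous with a cron expression. A sketch assuming a daily 1:00 AM run (the job name is an example; the cron fields are seconds, minutes, hours, day-of-month, month, day-of-week):

```apex
// Run every day at 1:00 AM
String cronExp = '0 0 1 * * ?';
System.schedule('Daily Invoice Batch Update', cronExp, new BatchUpdateScheduler());
```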

Step 5 — Create the Record-Triggered Flow (The Logic)

The final step is to build the Flow that performs the business logic. This is where the work happens.

  1. Navigate to Setup → Flows and click New Flow
  2. Select Record-Triggered Flow and click Create
  3. Object — Select your Target Object (e.g., Invoice)
  4. Trigger the Flow When — Select A record is updated
  5. Set Entry Conditions:
    • Condition Requirements: All Conditions Are Met (AND)
    • Field: Last_Batch_Update__c
    • Operator: Is Changed
    • Value: {!$GlobalConstant.True}
  6. Build Your Logic — Add your Decision, Update, Email, or Create elements here

Note: This flow will now run automatically for every single record processed by the batch job.

What Are the Common Challenges and How Do You Fix Them?

People and Client Challenges

  • "Custom Code" Fear — Cause: Clients worry about maintaining code. Solution: Explain that the code is minimal and static; the logic lives in Flows, which they can own.
  • Maintenance — Cause: Future admins need Apex knowledge to change the query. Solution: Document the query clearly and keep the code simple enough that minor query changes are trivial.

Technical Challenges

  • Unintended Triggers — Cause: Other flows fire on every batch update. Solution: Strictly audit entry criteria of all flows on the object. Use Last_Batch_Update__c Is Changed as a gate for the target flow; where non-target flows' entry criteria cannot otherwise be tightened, add Last_Batch_Update__c Is Changed = False.
  • CPU Timeouts — Cause: The flow triggered by the batch is too complex. Solution: Reduce the batch size in the Database.executeBatch call (e.g., to 100 or 50).
  • Confusion — Cause: Unclear where logic lives (Apex vs. Flow). Solution: Use strict naming conventions and documentation to indicate the Flow is "Batch Triggered".

Go-Live Checklist

Use this checklist before scheduling the batch in production.

Pre-Deployment (Sandbox)

  • Flow Audit: Verified that other flows on the object will NOT fire when the batch runs
  • Test Class: Written and passed with adequate coverage
  • Volume Test: Ran the batch with a realistic data volume to check for governor limits

Deployment

  • Field Deployment: Last_Batch_Update__c deployed
  • Code Deployment: Batch and Schedulable classes deployed
  • Flow Activation: Record-triggered flows activated

Post-Deployment (Production)

  • Schedule: Schedule the Apex job via Developer Console or UI
  • Monitor: Check "Apex Jobs" specifically for the first few runs to ensure no failures
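To spot-check the first few runs programmatically, you can query the AsyncApexJob object from the Developer Console (a sketch; the class name BatchUpdate is the template name from Step 1):

```apex
// Inspect the most recent batch runs for failures
for (AsyncApexJob job : [
    SELECT Status, NumberOfErrors, JobItemsProcessed, TotalJobItems, CompletedDate
    FROM AsyncApexJob
    WHERE JobType = 'BatchApex' AND ApexClass.Name = 'BatchUpdate'
    ORDER BY CreatedDate DESC
    LIMIT 5
]) {
    System.debug(job);
}
```

A run with NumberOfErrors greater than zero means at least one chunk failed partial-success; check the debug logs for that window.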

BOOK A CALL