
SuiteScript Map/Reduce: Patterns That Actually Work in Production

Map/Reduce scripts are one of those SuiteScript features that look amazing on paper and then immediately get misused in the wild. I’ve seen Map/Reduce scripts that do everything in map, ignore reduce entirely, or blow through governance because they treat it like a scheduled script with extra steps.

Let’s talk about a few real-world patterns that make Map/Reduce scripts easier to reason about, easier to debug, and much more reliable in production.

This isn’t a deep dive into “what is Map/Reduce” — this is about how to actually use it day-to-day without hating your past self.


First: Stop Treating Map/Reduce Like a Scheduled Script

If your Map/Reduce script looks like this:

function getInputData() {
  return search.create({...});
}


function map(context) {
  // Load record
  // Modify record
  // Save record
} 

You’re not wrong, but you’re missing most of the benefits.

Map/Reduce shines when:

  • You separate read logic from write logic
  • You group work intentionally
  • You let NetSuite handle retries and governance

If everything lives in map, you’re basically just getting automatic rescheduling — which is fine, but we can do better.
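As a quick reference, here are the four entry points in the order NetSuite runs them. This is a sketch with the bodies elided; in a real script file these would be wrapped in define(['N/search', 'N/record'], ...) and returned as an object.

```javascript
function getInputData() {
  // Runs once. Return a search, array, or object; each element feeds map().
}

function map(context) {
  // Runs once per input element. context.value is a string (JSON for
  // search results). Shape it, then context.write({ key: ..., value: ... }).
}

function reduce(context) {
  // Runs once per distinct key, with context.values holding everything
  // map() wrote for that key. Do the record mutation here.
}

function summarize(summary) {
  // Runs once at the end. Totals, errors, notifications.
}
```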


Pattern #1: map = Normalize, reduce = Mutate

The cleanest mental model I’ve found:

  • getInputData: fetch identifiers
  • map: normalize data into a predictable shape
  • reduce: do the actual record updates
  • summarize: report, don’t repair

Example: Updating Transactions by Customer

getInputData
Return a saved search that gives you just enough data to do the job.

function getInputData() {
  return search.create({
    type: 'transaction',
    filters: [['mainline', 'is', 'T']],
    columns: ['internalid', 'entity']
  });
}

map
Group transactions by customer.

function map(context) {
  var result = JSON.parse(context.value);

  context.write({
    key: result.values.entity.value,
    value: result.id
  });
}

The key thing here: no record loading, no saving.
Just shaping data.


reduce
Now you do the expensive stuff once per customer, not once per transaction.

function reduce(context) {
  var customerId = context.key;
  var transactionIds = context.values;

  transactionIds.forEach(function (id) {
    var rec = record.load({
      type: record.Type.SALES_ORDER,
      id: id
    });

    rec.setValue({
      fieldId: 'custbody_processed',
      value: true
    });

    rec.save();
  });
}

Grouping like this means any per-customer work — lookups, related-record loads, shared setup — happens once per key instead of once per transaction. That alone can cut governance usage dramatically.
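One caveat: a full record.load plus rec.save per transaction is still the expensive path. If all you're doing is flipping a body field or two, record.submitFields does the same write without loading the whole record, for fewer governance units. A sketch of the reduce above, rewritten (custbody_processed is the same illustrative field):

```javascript
function reduce(context) {
  context.values.forEach(function (id) {
    // submitFields updates body fields in place, which is cheaper in
    // governance units than record.load + rec.save.
    record.submitFields({
      type: record.Type.SALES_ORDER,
      id: id,
      values: { custbody_processed: true }
    });
  });
}
```

It only works for body fields (not sublists), so it's not a universal swap — but when it fits, it fits.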


Pattern #2: Don’t Be Afraid to Use Reduce for Single Records

A common misconception is that reduce is only useful when grouping many values. That’s not true.

If you want:

  • Better retry behavior
  • Cleaner separation of concerns
  • Easier error tracking

…it’s often better to push everything into reduce, even if there’s only one value per key.

Example:

function map(context) {
  context.write({
    key: context.key,
    value: context.value
  });
}

Why bother?

Because reduce failures are isolated.
If one record fails, the others still finish cleanly.
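You can see the isolation with a plain-JavaScript stand-in. Here processRecord is hypothetical, playing the role of the load/save work, and the loop mimics how the framework invokes reduce once per key:

```javascript
// Hypothetical stand-in for the real record work; throws for one id.
function processRecord(id) {
  if (id === 'bad') throw new Error('save failed');
  return id + ':done';
}

// The framework calls reduce once per key, so a throw only loses that key.
var keys = ['101', 'bad', '103'];
var done = [];
var failed = [];
keys.forEach(function (key) {
  try {
    done.push(processRecord(key));
  } catch (e) {
    failed.push(key); // in a real run, this lands in summary.reduceSummary.errors
  }
});
// done   -> ['101:done', '103:done']
// failed -> ['bad']
```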


Pattern #3: Treat Summarize as a Report, Not a Fixer

I’ve seen summarize functions that try to:

  • Reload failed records
  • Fix partial data
  • Rerun business logic

Don’t do this.

summarize is for:

  • Logging totals
  • Reporting errors
  • Sending notifications

Good summarize example

function summarize(summary) {
  if (summary.inputSummary.error) {
    log.error('Input Error', summary.inputSummary.error);
  }

  summary.reduceSummary.errors.iterator().each(function (key, error) {
    log.error('Reduce error for key ' + key, error);
    return true;
  });
}

If you need recovery logic, build another script, not a clever summarize hack.
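If you want a notification instead of (or alongside) logs, N/email works fine from summarize. A sketch — the author id and recipient below are placeholders, and email would be loaded via define() in a real script:

```javascript
function summarize(summary) {
  var failedKeys = [];

  summary.reduceSummary.errors.iterator().each(function (key, error) {
    failedKeys.push(key);
    log.error('Reduce error for key ' + key, error);
    return true;
  });

  if (failedKeys.length > 0) {
    email.send({
      author: 123,                        // placeholder employee internal id
      recipients: ['admin@example.com'],  // placeholder recipient
      subject: 'Map/Reduce finished with ' + failedKeys.length + ' failures',
      body: 'Failed keys: ' + failedKeys.join(', ')
    });
  }
}
```

Notice it still only reports. The failed keys go in an email; fixing them is someone (or some script) else's job.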


Pattern #4: Always Log with Context

When Map/Reduce scripts fail, debugging can be painful unless you log intentionally.

Bad log

log.error('Error', e);

Better log:

log.error({
  title: 'Reduce failed for customer ' + customerId,
  details: e
});

Remember:

  • Logs are often read days or weeks later
  • You won’t remember what “Error saving record” meant at the time

Future you will thank present you for the extra context.
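If you find yourself typing that title pattern over and over, a tiny helper keeps the habit cheap. logFailure here is a hypothetical name, not a NetSuite API:

```javascript
// Hypothetical helper: builds a log entry that always carries stage + key.
function logFailure(stage, key, e) {
  return {
    title: stage + ' failed for key ' + key,
    details: e && e.message ? e.message : String(e)
  };
}

// In a real script: log.error(logFailure('reduce', customerId, e));
var entry = logFailure('reduce', '123', new Error('Record has been locked'));
// entry.title   -> 'reduce failed for key 123'
// entry.details -> 'Record has been locked'
```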


Pattern #5: Saved Searches > Ad‑Hoc Record Loads

If your Map/Reduce script:

  • Loads records just to read fields
  • Pulls child data inside loops
  • Re-runs the same searches per record

…it’s probably doing too much work.

Where possible:

  • Push data selection into getInputData
  • Use summary columns
  • Let the search engine do the heavy lifting

As a rule of thumb:

Search early, load late.
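For example, instead of loading each transaction just to count activity per customer, a grouped search in getInputData hands map one pre-aggregated row per customer. A sketch (the columns are illustrative, and search would come from N/search via define()):

```javascript
function getInputData() {
  // One result per customer, already counted by the search engine --
  // no record loads, no per-record searches in map or reduce.
  return search.create({
    type: 'transaction',
    filters: [['mainline', 'is', 'T']],
    columns: [
      search.createColumn({ name: 'entity', summary: search.Summary.GROUP }),
      search.createColumn({ name: 'internalid', summary: search.Summary.COUNT })
    ]
  });
}
```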


Final Thoughts

Map/Reduce isn’t about “processing lots of records” — it’s about processing them well.

If you:

  • Keep map lightweight
  • Use reduce intentionally
  • Log like a responsible adult
  • Let searches do the work

…your scripts will scale better, fail less often, and be much easier to maintain.

And the next developer to touch your code (which might be you in six months) won’t silently curse your name.
