Reporting in Workday

Why Your Workday Numbers Look “Wrong”

When HR and Finance say “these Workday numbers don’t look right,” the root cause is rarely Workday itself. In most tenants, the real problem is how reports are designed: wrong data sources, conflicting filters, noisy calculated fields, and security gaps that quietly hide rows.

When Stakeholders Don’t Trust Your Workday Reports

If you spend time in Workday reporting, you’ve probably heard some version of:

  • “This doesn’t match our spreadsheet.”
  • “Finance is showing a different total.”
  • “Can you just export it? We’ll fix it in Excel.”

It’s easy to blame Workday when numbers don’t line up. But across different tenants, the same issues appear again and again—and they’re almost always about report design, not platform capability.

If your HR and Finance stakeholders regularly say the numbers “look wrong,” chances are you’re hitting one or more of these traps.

1. You Picked the Wrong Data Source

If the data source is wrong, everything built on top of it will be wrong too.

Common patterns:

  • Using a data source that only includes active employees when you actually need to see all workers, including terminated or contingent workers.
  • Mixing HR and Finance data sources with different grain (for example, worker-level vs transaction-level) and then trying to reconcile totals between them.
  • Starting from a niche data source when a broader, delivered one would have been safer.

No amount of filters or calculated fields can fix a fundamentally mismatched source. Before building anything complex, confirm:

  • Does this source include the population you care about?
  • At what level does it store data (worker, position, event, line item)?
  • Is this the same lineage you’re comparing against elsewhere?

If the answer is no, start over with the right source rather than patching around the wrong one.
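Data sources are picked in Workday’s UI rather than in code, but the grain question above can be sketched in plain Python over made-up rows (all worker and position IDs here are hypothetical):

```python
# Hypothetical extract at position grain: one row per position, not per worker.
rows = [
    {"worker": "W1", "position": "P1"},
    {"worker": "W1", "position": "P2"},  # W1 holds two positions
    {"worker": "W2", "position": "P3"},
]

row_count = len(rows)                            # position-level grain: 3
worker_count = len({r["worker"] for r in rows})  # worker-level grain: 2

print(row_count, worker_count)
```

The same data yields two different “headcounts” depending on the grain you count at, which is exactly why confirming the source’s level of storage comes before any report building.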

2. Your Filters and Prompts Are Fighting You

Even with the right data source, filters and prompts can quietly sabotage your results.

Typical issues:

  • Filters are applied on the wrong object; for example, filtering at a related level instead of on the primary business object.
  • Date or Effective Date isn’t prompted, so different users unknowingly compare different time windows.
  • Hidden or default filters (“Do Not Prompt at Runtime”) override what users think they’ve selected.

That one checkbox—whether something is prompted or not—can completely change what appears on the report.

To debug:

  • Temporarily expose all key filters as prompts and run tests with very broad settings.
  • Check whether any underlying default filters are still restricting data, even when prompts seem open.
  • Verify that org, date, and status filters are applied where you actually expect them.

Until filters and prompts are aligned with the intended question, stakeholders will keep seeing inconsistent results.
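The hidden-default trap is easier to see in a toy model. This is not Workday’s API—just a plain-Python sketch of a report definition where a baked-in filter runs before the user’s prompt, mirroring “Do Not Prompt at Runtime”:

```python
# Hypothetical data and report logic for illustration only.
workers = [
    {"id": 1, "status": "Active"},
    {"id": 2, "status": "Terminated"},
    {"id": 3, "status": "Active"},
]

HIDDEN_DEFAULT = "Active"  # baked into the report definition, invisible at runtime

def run_report(rows, prompt_statuses):
    # The hidden default filter is applied before the user's prompt is evaluated
    rows = [r for r in rows if r["status"] == HIDDEN_DEFAULT]
    return [r for r in rows if r["status"] in prompt_statuses]

# The user prompts for "all statuses", yet the terminated worker never appears.
result = run_report(workers, {"Active", "Terminated"})
print(len(result))  # 2, not 3
```

Exposing that hidden filter as a prompt (or removing it) is the debugging move the checklist above describes.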

3. Your Calculated Fields Are Double-Counting

Calculated fields are powerful, but they’re also an easy way to inflate or distort totals.

Common traps:

  • Aggregating at the wrong level when workers have multiple rows (for example, multiple positions, events, or dependents).
  • Building complex calculated fields, then reusing them everywhere without validating the logic in isolation.
  • Summing values in a matrix or pivot that already reflect aggregated or overlapping data.

If your math is built on noisy grain, your totals will always look suspicious.

Safer patterns:

  • Start with a small debug report using just the fields and groups relevant to the calculation.
  • Confirm row-level behaviour: are you counting workers, events, assignments, or something else?
  • Only then roll the calculated field into your “official” reports and dashboards.

Once you understand the grain, your numbers become much easier to defend.
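Double-counting from noisy grain can be shown with a small hypothetical extract: salary is a worker-level value, but the rows are at position grain, so a naive sum inflates the total (the fix is to collapse to worker grain before aggregating):

```python
# Hypothetical rows at position grain; salary repeats on every position row.
rows = [
    {"worker": "W1", "position": "P1", "salary": 80000},
    {"worker": "W1", "position": "P2", "salary": 80000},  # same salary, second row
    {"worker": "W2", "position": "P3", "salary": 60000},
]

naive_total = sum(r["salary"] for r in rows)  # 220000: W1 is counted twice

# Collapse to one row per worker before aggregating
salary_by_worker = {r["worker"]: r["salary"] for r in rows}
correct_total = sum(salary_by_worker.values())  # 140000

print(naive_total, correct_total)
```

A debug report that groups by worker first, as suggested above, is the Workday equivalent of that deduplication step.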

4. You’re Misusing Composite Reports

Composite Reports have a reputation as “advanced mode,” so many teams reach for them too early.

In reality, Composite is a specialised tool for:

  • Combining multiple report results or data sets into one “one-stop” output.
  • Handling multi-source, multi-period, or multi-view scenarios that truly cannot be expressed in a single Advanced report.

When you use Composite as your default for anything complex, you usually add:

  • More joins and relationships than you really need.
  • More prompts and filters to coordinate.
  • More performance issues and more places where numbers can drift.

If a single Advanced report (possibly surfaced on a dashboard) can answer the question, start there. Treat Composite as the last resort for genuinely complex needs—not as a badge of seniority.

5. Security Is Silently Hiding Rows

One of the most underestimated causes of “wrong” totals is security.

Scenario:

  • An admin runs a report and sees 1,254 workers.
  • A manager runs the same report, with seemingly identical prompts, and sees 1,037.
  • Both assume the other view is wrong.

In fact, both views might be correct for each user’s security context.

Because Workday enforces security at the data level, reports will naturally show different populations based on:

  • Supervisory org visibility.
  • Role-based access and domain security.
  • Field-level security and constrained data sources.

If you only test reports as an admin, you will miss how they behave for real users.

Always:

  • Test key reports as different roles (HR Partner, HRBP, Manager, Finance, etc.).
  • Clearly communicate which roles a report is designed for.
  • Document whether a report is meant to be tenant-wide, region-specific, or org-specific.

If you don’t account for security, you’ll spend hours chasing “wrong” numbers that are actually right for that user.
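Row-level security is enforced inside Workday, but the admin-vs-manager scenario can be sketched in plain Python over invented data: security trims the rows each role can see before any prompt or filter runs, so identical prompts legitimately return different populations.

```python
# Hypothetical workers and org visibility for illustration only.
workers = [
    {"id": 1, "org": "Sales"},
    {"id": 2, "org": "Sales"},
    {"id": 3, "org": "Engineering"},
]

def run_as(rows, visible_orgs):
    # Security is applied at the data level, before report logic is evaluated
    return [r for r in rows if r["org"] in visible_orgs]

admin_view = run_as(workers, {"Sales", "Engineering"})  # sees all 3 workers
manager_view = run_as(workers, {"Sales"})               # sees 2 workers

print(len(admin_view), len(manager_view))  # neither count is "wrong"
```

Testing a report only as the admin is testing only the first of those two calls.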

A Simple Playbook to Fix “Wrong” Numbers

When stakeholders don’t trust your reports, you don’t need a brand-new reporting strategy. You need a disciplined way to design and debug.

Here’s a practical playbook.

Start with the question, not the tool

  • Ask, “What decision will this report support?” before building anything.
  • Choose the simplest data source and an Advanced report that can answer that specific decision-making need.

Strip the report back to basics

  • Remove non-essential filters, columns, and calculated fields.
  • Run the report with broad prompts (wide date ranges, minimal org restrictions).
  • Prove that the base data set is correct first; only then add complexity.

Test your prompts deliberately

  • Check that prompts are bound to the right underlying fields.
  • Make sure critical prompts—dates, orgs, populations—are present and, where appropriate, mandatory.
  • Look for “hidden” filters or defaults that override the user’s choices at runtime.

Debug calculated fields in isolation

  • Build a tiny “debug” report that exists solely to test calculated fields and groupings.
  • Compare its totals against a trusted reference (manual count, legacy report, or a smaller population).
  • Only plug complex calcs into production reports when they match reality in isolation.
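The debug-in-isolation step amounts to a simple contract: the calculation must match a trusted reference on a tiny, hand-countable population before it goes anywhere near production. A minimal sketch, with an invented calculation and a hand-counted reference:

```python
# Hypothetical tiny debug population; small enough to verify by hand.
debug_rows = [
    {"worker": "W1", "events": 2},
    {"worker": "W2", "events": 1},
]

def total_events(rows):
    # Stand-in for the calculated field under test
    return sum(r["events"] for r in rows)

MANUAL_REFERENCE = 3  # counted by hand from the same rows

# Promote the calculation into production reports only if this check passes.
assert total_events(debug_rows) == MANUAL_REFERENCE
```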

Apply this playbook consistently and two things will happen:

  1. Stakeholders will stop opening every report conversation with “These numbers look wrong.”
  2. You will become the person leaders rely on when they need numbers they can make decisions with.

And in Workday reporting, that trust is the most valuable metric you can own.

You May Also Like