Digital Finance: Dashboard Reporting

Designing your cloud-based system for usefulness

You know that thoughtful measurement and reporting is important and that the options for aggregating and displaying organizational information have multiplied exponentially during the past decade. It’s smart to get a handle on your current data management and reporting capacity from a technology standpoint before you launch into new reporting.

Start by reviewing the existing data sources that feed your day-to-day, monthly, and annual operations and reporting. Get an understanding of the current state of the manual and automated business processes that already support your reporting and decision-making. And assess how well your current computer hardware and software are meeting your needs.

The growth of cloud-based, self-service technology tools — financial and otherwise — means that nonprofits of all sizes have better and more affordable options. There are a host of inexpensive, cloud-based business intelligence applications that might or might not be right for your organization. Benefits of moving from legacy systems to the cloud include:

• Real-time automation, with access to information from any location, including mobile devices;

• Lower installation and start-up costs, scalable licensing options, and the ability to add users and applications as you grow;

• Better collaboration inside and outside your organization, including easy online file storage and sharing versus file sending;

• Access to the latest security and automatic backup technologies, giving you peace of mind in case of any need for disaster recovery; and,

• Reporting “out of the box” and the ability to integrate with other financial and non-financial applications.

If you think your organization’s reporting could use a refresh, a helpful starting point is to ask yourself three simple questions: Why? Who? And, so what?

Why: Know Why You Report

Start by answering the basic question of why you report what you report. Some common answers might be:

• Improving your ability to deliver programs, services, and products that fulfill your mission;

• Focusing the staff’s attention on the things that really matter;

• Because your board members, funders, customers, and an increasingly savvy public want information; and,

• Because, increasingly, if you don’t tell your own story someone else will tell it for you.

Who: Know Your Audience

The next question to ask yourself in any reporting endeavor is about audience: who are they and what do they care about? It can be a helpful exercise to inventory your audiences. They could be internal (board chair, committee chairs, CEO, staff members, or volunteers) or external (potential beneficiaries, peer organizations, key partners, or the general public). Put yourself in their shoes and, for each, write down what they care about with respect to your organization.
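
If it helps to keep that inventory somewhere more durable than a whiteboard, the minimal sketch below (in Python, with audience names and concerns that are purely illustrative placeholders) shows one way to capture it:

# A simple audience inventory: who we report to and what they care about.
# Audience names and concerns below are illustrative placeholders only.
audience_inventory = {
    "Board chair": ["overall financial health", "progress against the strategic plan"],
    "Committee chairs": ["area-specific results", "compliance status"],
    "CEO and staff": ["program delivery", "operational efficiency"],
    "Funders": ["use of funds", "measurable impact"],
    "General public": ["the mission story", "transparency"],
}

for audience, concerns in audience_inventory.items():
    print(f"{audience}: {', '.join(concerns)}")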

So What: Measures that Matter

With the above backdrop in mind, now zoom in a little bit. As you look at what your audience cares about, sketch out your core reporting areas. Finance might be an obvious one. Other areas could be compliance, learning and development, and impact. It’s up to you.

In shaping those reporting areas, it can be helpful to articulate an overarching purpose for each. Try to write just one sentence that succinctly expresses each reporting area’s importance to your organization’s ability to execute on its strategies and advance its mission. For example:

Finance: To maintain finances that support growth and expansion while sustaining a healthy infrastructure.

Internal Process: To create and maintain effective, compliant, and well-integrated systems that meet stakeholders’ needs.

Impact: To provide supportive services which help men, women, and children in need thrive today and be equipped to thrive tomorrow.

Learning and Growth: To attract and develop staff and board aligned with our guiding values and positioned to advance the mission.
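
One way to keep each area’s purpose statement and its eventual measures in one place, so they travel together into whatever reporting tool you adopt, is a simple structure like the sketch below; the candidate metrics listed are assumptions for illustration, not recommendations.

# Reporting areas, each with a one-sentence purpose and candidate metrics.
# The metric names are illustrative assumptions only.
reporting_areas = {
    "Finance": {
        "purpose": "Maintain finances that support growth while sustaining a healthy infrastructure.",
        "candidate_metrics": ["months of cash on hand", "budget-to-actual variance"],
    },
    "Internal Process": {
        "purpose": "Create and maintain effective, compliant, well-integrated systems.",
        "candidate_metrics": ["audit findings closed", "on-time filings"],
    },
    "Impact": {
        "purpose": "Help people in need thrive today and be equipped to thrive tomorrow.",
        "candidate_metrics": ["clients served", "client outcomes achieved"],
    },
    "Learning and Growth": {
        "purpose": "Attract and develop staff and board aligned with our values and mission.",
        "candidate_metrics": ["staff retention rate", "training hours completed"],
    },
}

for area, detail in reporting_areas.items():
    print(f"{area}: {detail['purpose']}")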

Qualities of Good Measures

Now think about the reporting areas you just defined and brainstorm as many metrics as you can for each. Then rate or rank them against a range of criteria, for example, ease of collection, stakeholder or audience relevance, or mission alignment. But before you do, take a minute to consider what makes for a good measure or metric.

Look for measures that are:

• Objective and unbiased;

• Statistically reliable but inexpensive to collect;

• A mix of qualitative and quantitative;

• Sensitive enough that small changes are meaningful;

• Tied to identifiable influencers; and,

• Accompanied by, or able to set, a standard of comparison.
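
Rating and ranking can be as simple as a weighted score across whichever criteria you settle on. The sketch below assumes three equally weighted criteria scored from 1 to 5; the example metrics and scores are invented for illustration.

# Rank candidate metrics by a weighted score across chosen criteria.
# The weights, metrics, and scores here are illustrative assumptions.
weights = {"ease_of_collection": 1.0, "audience_relevance": 1.0, "mission_alignment": 1.0}

candidates = {
    "Months of cash on hand": {"ease_of_collection": 5, "audience_relevance": 4, "mission_alignment": 3},
    "Client outcomes achieved": {"ease_of_collection": 2, "audience_relevance": 5, "mission_alignment": 5},
    "Training hours completed": {"ease_of_collection": 4, "audience_relevance": 2, "mission_alignment": 3},
}

def weighted_score(scores):
    return sum(weights[criterion] * value for criterion, value in scores.items())

ranked = sorted(candidates.items(), key=lambda item: weighted_score(item[1]), reverse=True)
for metric, scores in ranked:
    print(f"{weighted_score(scores):5.1f}  {metric}")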

Bringing it All Together

Once you know to whom you’re reporting and what you’d like to tell them, you can map out the needed data and determine which technologies might help host, aggregate, and generate your reports. You’ll want to consider whether the application you choose will integrate with your other data management systems: the financial applications that pay bills and process payroll, donor or member management tools, or simply your contacts.

Once you’ve prototyped your reporting tools, you can map your data systems to those tools, documenting how data would flow from existing or new sources into the newly developed reports.
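
Documenting those flows does not require anything elaborate; even a plain mapping of source systems to the reports they feed, as in the sketch below (the system and report names are hypothetical), can surface gaps before you commit to a particular tool.

# A plain map of which data sources feed which reports.
# Report and source names below are hypothetical examples.
data_flows = {
    "Monthly finance dashboard": ["accounting system export", "payroll provider export"],
    "Board impact summary": ["program database", "client survey results"],
    "Learning and growth report": ["HR system", "training log spreadsheet"],
}

for report, sources in data_flows.items():
    print(f"{report} <- {', '.join(sources)}")

# Checking the distinct sources helps confirm each has an owner and a refresh schedule.
distinct_sources = sorted({source for sources in data_flows.values() for source in sources})
print("Distinct sources to govern:", len(distinct_sources))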

Regardless of the tools you choose, your reporting taken as a whole ought to both track progress and stimulate dialogue with your audience around the fundamental question: How are we doing?

Case In Point

A private foundation was a few years into new executive leadership and a new strategic plan, coming off what was largely considered a decade of hard-learned lessons. There wasn’t anything particularly wrong with the strategic plan it was operating under, except that it was almost entirely program-focused. It had to be, given the need to pivot away from the very public shortcomings of the foundation’s previous decade-long strategy.

A byproduct of that program-heavy plan was that it left the rest of the foundation in the background. It made it hard to see what else was important, and it didn’t give the foundation a very holistic view of how the entire organization contributed to advancing its mission.

Foundation managers set out to develop a balanced scorecard as a way to broaden the strategic plan so that the entire organization would be more robustly included and all staff could see how their efforts connected to the mission. This effort was led, overseen, and championed by the CEO and the board’s Executive Committee, a natural home given that the committee comprises the chairs of all the other board committees.

The executive team then drove its development among the staff. The effort was both top-down and bottom-up. It was iterative. It was inclusive, though not a democracy. It took about 16 months from the first staff kick-off to the board’s final adoption of the scorecard.

Whether or not you choose to follow a similar path in your organization, here are a few lessons learned along the way that can be applied to any reporting endeavor.

• Use precise language. Say what you mean and sharpen wherever you can.

• Don’t ask your scorecard to do too much. Keep it simple. Err on the side of stripping away instead of adding.

• Don’t seek perfection. The perfect time to begin will never exist and consensus will never be reached.

• Practicality is critical. It must be clear that the results will be used to inform better decision-making, not merely as an intellectual endeavor.

• Use it to reinforce a learning culture. Staff needs to understand that revealing negative findings will not be viewed as a “gotcha” exercise but rather as constructive information for improvement.

Ben Aase is a principal with CliftonLarsonAllen LLP, a professional services firm providing business consulting, outsourcing, wealth advisory, and public accounting services to the nonprofit sector, with offices around the country. His email address is [email protected]