Tools for Nonprofits to Effectively Manage and Analyze Their Data

Kurt Voelker

Vice President, Business Strategy and Growth, Forum One

Not that long ago, powerful data management and analysis tools were often only accessible to the commercial sector. Thankfully, that’s changing. Today, there is a wealth of modern tools that make deep and meaningful data analysis more accessible to the nonprofit and social good sector.

I recently hosted a webinar that digs into the approaches and tools that Forum One thinks any modern nonprofit, association, foundation, or government agency should consider when managing their data. This post offers a simplified overview of that more in-depth talk, along with a quick reference guide to the tools and technologies we discussed as essential for a modern data management stack.

Watch the full webinar recording here: Preparing for DataViz: What It Takes to Manage and Analyze Your Data.

When people think about data, data visualization (a.k.a. dataviz) is often the star that gets all the attention. But in many ways, dataviz is just the visible tip of an iceberg of data work. To efficiently and consistently produce high-quality data visualizations, whether in health, education, poverty, or any other important issue area, mission-driven organizations first need platforms in place to manage and analyze their data.

When considering how to approach data management, a modern data platform, or “stack”, needs to do four things really well:

  1. Get the data
  2. Store the data
  3. Model the data
  4. Analyze the data

Until recently, following these steps meant that researchers and policy analysts had to rely on outdated approaches, such as spreadsheets, shared drives, and internal databases of varying sophistication. The great news is that new approaches and affordable tools are changing how nonprofits manage and analyze data at every one of these steps.

1. Tools to Help You ‘Get the Data’

The first step in managing data is consuming and transforming it. How can you get data into your platform for future analysis? In the old days, getting data out of one system and moving it into a data warehouse or central data store required ad-hoc scripting, couldn’t accommodate real-time connections and updates, and was really difficult to manage over time.

There are a number of products available today designed specifically to take the pain out of this “extraction” of data. These “data-piping” tools let you connect to a myriad of cloud-based services and data sources, extract data in real time, map and transform that data, and then move it to a cloud-based data store (the next “layer” in our modern data stack).

We recommend tools such as Alooma, Segment, Fivetran, and Matillion at this first stage.
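Under the hood, these data-piping products all automate the same extract–transform–load pattern. Here is a minimal sketch of that pattern in Python; the record fields (`donor_id`, `amount_cents`) and the in-memory "warehouse" are hypothetical stand-ins for a real SaaS API payload and a real cloud data store.

```python
import json

# A hypothetical JSON payload, standing in for what a SaaS API would return.
SAMPLE_API_RESPONSE = json.dumps([
    {"donor_id": 1, "amount_cents": 2500, "ts": "2019-03-01"},
    {"donor_id": 2, "amount_cents": 10000, "ts": "2019-03-02"},
])

def extract(raw):
    """Parse the raw API payload into records."""
    return json.loads(raw)

def transform(records):
    """Map source fields onto the warehouse schema (cents -> dollars)."""
    return [
        {"donor": r["donor_id"],
         "amount_usd": r["amount_cents"] / 100,
         "donated_on": r["ts"]}
        for r in records
    ]

def load(rows, warehouse):
    """Append rows to the central store (a list standing in for a warehouse)."""
    warehouse.extend(rows)

warehouse = []
load(transform(extract(SAMPLE_API_RESPONSE)), warehouse)
print(warehouse[0]["amount_usd"])  # first donation, now in dollars
```

What the commercial tools add on top of this skeleton is exactly what the old ad-hoc scripts lacked: real-time scheduling, schema mapping, and monitoring over time.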

2. Tools to Help You ‘Store the Data’

Once you’ve got your data, you need to keep it somewhere in the cloud where you can continue to add additional or updated data as it comes in. Over the past 8 years, the commercial sector’s focus on big data and data processing has had a huge impact on the availability and affordability of cloud data warehousing. What used to be expensive and hard to set up and manage is now much easier and affordable.

Tools we recommend at this stage include Google BigQuery, Amazon Redshift, and Snowflake.
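At its core, what these warehouses give you is a durable table you can keep appending to (or updating) as new data arrives, plus fast SQL on top. The sketch below uses Python's built-in `sqlite3` purely as a stand-in for a cloud warehouse; the table and column names are illustrative, and the upsert pattern shown is how updated records get folded in without duplicating rows.

```python
import sqlite3

# An in-memory SQLite database stands in for the cloud warehouse here.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE donations (
        donor      INTEGER PRIMARY KEY,
        amount_usd REAL,
        donated_on TEXT
    )
""")

def upsert(rows):
    """Insert new rows, or refresh existing ones as updated data arrives."""
    conn.executemany(
        """INSERT INTO donations (donor, amount_usd, donated_on)
           VALUES (?, ?, ?)
           ON CONFLICT(donor) DO UPDATE SET
               amount_usd = excluded.amount_usd,
               donated_on = excluded.donated_on""",
        rows,
    )

upsert([(1, 25.0, "2019-03-01"), (2, 100.0, "2019-03-02")])
upsert([(2, 150.0, "2019-03-05")])  # donor 2's record is updated in place

total = conn.execute("SELECT SUM(amount_usd) FROM donations").fetchone()[0]
print(total)  # 175.0
```

The real products handle the parts this sketch ignores: scale, concurrent access, and not having to manage any servers yourself.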

3. Tools to Help You ‘Model the Data’

Once you’ve been able to centralize and store your data, you are ready to put it into a format that will allow you to start answering the questions your organization is asking. A modern data platform gives analysts at your organization tools to define data dimensions, measures, calculations, and aggregates that the entire organization can share for querying and exploration.

Tools we recommend at this stage include Looker, Superset, Mode, and Metabase.
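The key idea behind these modeling tools is that dimensions and measures get defined once, centrally, and then reused by everyone. A toy version of that idea in Python, with hypothetical field names, looks like this:

```python
# Hypothetical donation rows, as they might come out of the warehouse layer.
ROWS = [
    {"program": "health", "amount_usd": 25.0},
    {"program": "health", "amount_usd": 150.0},
    {"program": "education", "amount_usd": 40.0},
]

# Shared measure definitions: named aggregates the whole org can reuse.
MEASURES = {
    "total_raised": lambda rows: sum(r["amount_usd"] for r in rows),
    "gift_count":   lambda rows: len(rows),
}

def query(rows, dimension, measure):
    """Group rows by a dimension and apply a shared measure to each group."""
    groups = {}
    for r in rows:
        groups.setdefault(r[dimension], []).append(r)
    return {key: MEASURES[measure](group) for key, group in groups.items()}

print(query(ROWS, "program", "total_raised"))
# {'health': 175.0, 'education': 40.0}
```

Because "total raised" is defined in one place, every analyst slicing by any dimension gets the same number, which is precisely the consistency a shared modeling layer buys you.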

4. Tools to Help You ‘Analyze the Data’

This is the step where you can really dig in, ask various questions of your data, and show and share insights. This ‘layer in the stack’ has a number of established players (e.g., Tableau), but a number of ‘new breed’ tools are becoming very popular because of their ease of use, their web-native architectures, and their newer approaches to modeling and sharing access to data, dashboards, and visualizations within organizations.

Tools we recommend at this stage include Looker, Tableau, Domo, Superset, GoodData, Periscope Data, and Metabase.
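Most of the questions these tools answer boil down to rolling modeled data up into a chart-ready summary. As a sketch of the kind of question you might ask, here is a month-over-month giving trend computed from hypothetical donation rows; any of the dashboarding tools above would turn this summary into a line chart.

```python
from collections import defaultdict

# Hypothetical donation rows with ISO dates, as the modeling layer might expose.
DONATIONS = [
    {"donated_on": "2019-02-14", "amount_usd": 60.0},
    {"donated_on": "2019-03-01", "amount_usd": 25.0},
    {"donated_on": "2019-03-05", "amount_usd": 150.0},
]

def monthly_totals(rows):
    """Roll donations up by calendar month (YYYY-MM) for a trend chart."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["donated_on"][:7]] += r["amount_usd"]
    return dict(sorted(totals.items()))

print(monthly_totals(DONATIONS))
# {'2019-02': 60.0, '2019-03': 175.0}
```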

More than ever before, the tools above allow mission-driven organizations to take hold of their data in a faster, easier, and more affordable way. Again, to learn more about how to approach data management and analysis, check out my recent webinar, “Preparing for DataViz: What It Takes to Manage and Analyze Your Data.”

