5 must-have features for a data importer

Is your data importer as intuitive, fast, and accurate as it could be for your users? Does it require minimal maintenance from your in-house engineers so you can focus on building the product features your customers love?

If your answer to either of these questions was anything other than a resolute “yes”, then you might want to keep reading. In this post, we explore different methods for handling data import and introduce the concept of a pre-built data importer that requires no patchwork open source coding. Plus, we provide the top five features you should vet when choosing a data importer for your web-based app.

What is a data importer?

A data importer helps a user upload offline data or data from another online source into an application database. This user might be a customer, an internal employee, or a customer success representative who is offering white-glove onboarding assistance to a customer.

Companies need to onboard all sorts of data, depending on the application accepting it. Uploaded data could be email subscribers, customer contacts, inventory data, SKUs, task lists, etc.

There are a few different ways data imports are handled today. 

  • Custom-developed solution - Engineering teams often build their own data importer by patching together open source solutions. This can be very complicated; a comprehensive importer requires UI and application code to handle data normalization, parsing, and mapping. Building a solution in-house is resource-intensive and time-consuming, often keeping engineers away from their core product features.

  • Pre-built data importer - Another method is to use a ready-to-go data importer from a third-party vendor. This reduces the need for manual uploading and means your in-house engineers won’t spend months building an importer themselves.

  • Mix of manual and productized - In some cases, software companies might use a mix of manual and productized data importing. For example, they may have a spreadsheet with templated columns and request that customers add their data to the spreadsheet before uploading it. 

  • Manual processes - And finally, there’s always the option of doing everything manually, which is more common than you might expect. With manual processes, we’re not just referring to uploading, but everything from data preparation and cleaning to getting the data into the target software application.

What are some challenges with data importing?

Where to begin? Importing data is always chock full of challenges. While there are endless technical difficulties, the core issues come down to three categories: the effect these challenges have on…

  • paying customers 

  • internal software users

  • and the engineering team

Let’s explore these issues a little further.

Time to value for customer

When it’s difficult for customers to import their data—either because of time-consuming manual processes or because a custom-built data importing feature is difficult to use—everyone loses. 

The customer loses because they can’t get value out of the product, and the business loses because this customer is more likely to churn… if they ever get started in the first place.

Efficiency for employees (in the case of internal tool usage)

Similarly, when the user is an employee of a company, and the software is something that was built or purchased for internal use, slow data imports can cause problems. In this case, the business isn’t losing money directly through customer churn, but indirectly through employees who spend inordinate amounts of time managing imports.

Work prioritization for the engineering team

On the flip side of the user perspective is the engineering team. Data importers are incredibly time-consuming to build: they are not only technically complicated but also difficult to design intuitively, given how many points of failure there are.

There are seemingly endless error messages that engineers need to provide so that customers know exactly what went wrong with their data import and how they can fix the files. End users leave with more questions than answers when the data import UX isn’t elegantly crafted.

Many engineering teams piece together different open source solutions to make the build faster, but even then, challenges abound.

Open source has made it easier for people to build out homegrown solutions, but those open source libraries only address small, discrete parts of a data import workflow. For example, you might be able to find an open source library that parses Excel files and nothing more, leaving your team to build out the rest.
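As a rough sketch of what that looks like in practice, here’s how a team might use the open source SheetJS (xlsx) library in a Node script to parse a spreadsheet. The file name is a placeholder, and everything after parsing is exactly the work that still has to be built:

```typescript
// Sketch: parsing is the easy part. An open source library like SheetJS
// turns a spreadsheet into rows, but mapping, validation, and error
// messaging are still left to your team. "contacts.xlsx" is a placeholder.
import * as XLSX from "xlsx";

const workbook = XLSX.readFile("contacts.xlsx");
const sheet = workbook.Sheets[workbook.SheetNames[0]];

// Rows come back as arrays of raw cell values, with no schema attached.
const rows: unknown[][] = XLSX.utils.sheet_to_json(sheet, { header: 1 });

console.log(`Parsed ${rows.length} rows`);
// Everything after this point (matching columns to your schema, validating
// values, surfacing errors to the user) still has to be written in-house.
```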

Why you should consider a ready-made data importer

A pre-built data importer can save time for both customers and the engineering team. 

Custom-built solutions cost engineering time and may or may not solve customer problems, depending on how intuitively they’re built and how helpful the import error messages are. Manual processes save big on engineering time but ultimately cost the customer time in a way that can negatively impact revenue.

There’s a lot of money to be made in focusing on customer retention. For software companies, it’s impossible to prioritize retention without taking a good hard look at just how simple it is for users to import their critical data during the data onboarding process.  

That’s why a quality data importer is worth considering when it comes to customer retention and the overall customer experience. Consider that US companies collectively lose $136 billion per year to customers switching providers, and that increasing customer retention rates by just 5% can lift profits by 25% to 75%.


Top data importer features to consider

Data importer features can number in the dozens. When you’re considering implementing a pre-built data importer for your software, you need to check for the most essential features first.

You’re looking for something that fits naturally within your product, that will be easy to use for customers, and that will drastically reduce the time it takes for data to be useful.

Let’s take a look at five must-have features below.

1. Data importer feature: Data parsing

Data parsing is the process of taking an aggregation of information (a file) and breaking it into discrete parts. Essentially, parsing separates a file’s contents into structured, usable data.

Flatfile’s data onboarding platform not only handles data parsing but offers intelligent parsing, so a user doesn't have to explicitly define how a file has been aggregated (for example, “This is a UTF-8 encoded CSV”). Instead, the encoding is auto-detected from the file.

You’ll want to look for a data parsing feature that not only provides the technicalities of going from a file to an array of discrete data but also streamlines this process for your customers.
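As a rough illustration (not Flatfile’s implementation), here’s a naive sketch of what an importer has to cover on the parsing side: detect the encoding and delimiter from the file itself, then break it into rows and fields. Real detection is far broader than this:

```typescript
// Illustrative sketch of "intelligent" parsing: the user just drops a file,
// and the importer figures out encoding and delimiter rather than asking
// "is this a UTF-8 CSV?". This naive version only checks a byte-order mark
// and a few delimiter candidates.

function detectEncoding(bytes: Uint8Array): "utf-8" | "utf-16le" {
  if (bytes[0] === 0xff && bytes[1] === 0xfe) return "utf-16le"; // UTF-16 LE BOM
  return "utf-8"; // default assumption
}

function detectDelimiter(firstLine: string): string {
  // Pick whichever candidate delimiter appears most often in the header row.
  const candidates = [",", ";", "\t"];
  return candidates.reduce((best, d) =>
    firstLine.split(d).length > firstLine.split(best).length ? d : best
  );
}

function parse(bytes: Uint8Array): string[][] {
  const text = new TextDecoder(detectEncoding(bytes)).decode(bytes);
  const lines = text.split(/\r?\n/).filter((line) => line.trim() !== "");
  const delimiter = detectDelimiter(lines[0]);
  // Break the file into discrete rows and fields (no quoted-field handling here).
  return lines.map((line) => line.split(delimiter));
}
```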

2. Data importer feature: Data mapping

Mapping and matching are often used interchangeably; either way, the term refers to taking the previously unknown data you started with and matching it to a known target.

It’s an absolute requirement that your data importer do this very well. A good example is when you need to configure your data importer to accept contacts. Let’s say that one of the fields is “email.” The customer might choose a file where the field is labeled “email address.”

If you don’t have data mapping in the product experience, that example import will fail because “email address” does not equal “email,” and so the customer can’t complete the import.
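To illustrate the idea (the field names and synonym list below are made up for this example, not any particular product’s API), a minimal mapping step might normalize incoming headers and match them against a target schema so “Email Address” still lands in the “email” field:

```typescript
// Hypothetical header-to-field mapping sketch.
const targetFields = ["email", "first_name", "last_name"];

const synonyms: Record<string, string> = {
  "email address": "email",
  "e-mail": "email",
  "given name": "first_name",
  "surname": "last_name",
};

function normalize(header: string): string {
  return header.trim().toLowerCase().replace(/[_\s]+/g, " ");
}

// Returns a map from incoming column name to target field (or null if unmatched).
function mapHeaders(incoming: string[]): Record<string, string | null> {
  const mapping: Record<string, string | null> = {};
  for (const header of incoming) {
    const key = normalize(header);
    const asField = key.replace(/ /g, "_");
    if (targetFields.includes(asField)) {
      mapping[header] = asField;
    } else {
      mapping[header] = synonyms[key] ?? null; // null -> ask the user to map it
    }
  }
  return mapping;
}

// mapHeaders(["Email Address", "First Name", "Favorite Color"])
// => { "Email Address": "email", "First Name": "first_name", "Favorite Color": null }
```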

3. Data importer feature: Data structuring

Most product teams have an application database and an API, and they want to make sure that the data can flow into the database seamlessly. The way you get data into your system matters, because you want to make sure that it’s labeled appropriately. APIs expect data in a certain format and will fail otherwise.

A data importer should be flexible and be able to plug right into however you best accept data in your application or database. Flatfile’s data onboarding platform offers a variety of ways to structure data that has been uploaded, mapped, and validated by users, meaning it can fit right into your preferred data flow.
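As a simple sketch, with a hypothetical Contact type and a hypothetical /api/contacts/bulk endpoint, structuring means reshaping the mapped rows into exactly the payload your API expects:

```typescript
// Sketch of a structuring step: rows that have been parsed, mapped, and
// validated still need to be shaped into whatever the API accepts.
interface Contact {
  email: string;
  firstName: string;
  lastName: string;
}

// Turn generic mapped rows into the exact shape the API accepts.
function toContacts(rows: Record<string, string>[]): Contact[] {
  return rows.map((row) => ({
    email: row["email"],
    firstName: row["first_name"],
    lastName: row["last_name"],
  }));
}

async function importContacts(rows: Record<string, string>[]): Promise<void> {
  const payload = { contacts: toContacts(rows) };
  // The API expects this specific structure and will reject anything else.
  await fetch("/api/contacts/bulk", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
}
```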

4. Data importer feature: Data validation

Another important feature for a data importer is data validation, which checks if the data matches an expected format or value. 

Having this feature in place prevents issues from occurring down the line. For example, if special characters can’t be used within a certain template or other feature, then you don’t want customers to be able to import them. Otherwise, they’ll be frustrated when an error message pops up during use that should have appeared during the import stage. Without data validation, your customers might end up having to remove and re-upload data. That can lead to frustration, which of course leads to churn.
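Here’s an illustrative validation pass that catches these problems at import time and reports them per row. The specific rules (the email pattern, the forbidden characters) are example assumptions, not a standard:

```typescript
// Check each mapped row against expected formats before it reaches the
// application, and collect row-level errors the importer can show the user.
interface RowError {
  row: number;
  field: string;
  message: string;
}

const EMAIL_PATTERN = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
const FORBIDDEN_CHARS = /[<>{}]/; // pretend the target template can't accept these

function validate(rows: Record<string, string>[]): RowError[] {
  const errors: RowError[] = [];
  rows.forEach((row, i) => {
    if (!EMAIL_PATTERN.test(row["email"] ?? "")) {
      errors.push({ row: i + 1, field: "email", message: "Not a valid email address" });
    }
    if (FORBIDDEN_CHARS.test(row["first_name"] ?? "")) {
      errors.push({ row: i + 1, field: "first_name", message: "Contains unsupported characters" });
    }
  });
  return errors; // surfaced in the import UI so the user can fix data before submitting
}
```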

5. Data importer feature: Data transformation

Data transformation is another key feature. Data transformation means making changes to data as it flows into the system so that it meets an expected or desired value. Rather than sending data back to users with an error message, data transformation makes small, systematic tweaks that render the data usable. For example, when importing a task list, prioritization data could be transformed into a different value, such as numbers instead of labels, or numbers that get rounded.
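A minimal sketch of that kind of transformation, using an invented priority mapping for illustration, might look like this:

```typescript
// Instead of bouncing the file back with an error, quietly normalize values
// as they flow in. The label-to-number mapping here is a made-up example.
const PRIORITY_LABELS: Record<string, number> = { high: 1, medium: 2, low: 3 };

function transformPriority(value: string): number | null {
  const label = value.trim().toLowerCase();
  if (label in PRIORITY_LABELS) return PRIORITY_LABELS[label]; // "High" -> 1
  const n = Number(label);
  return Number.isNaN(n) ? null : Math.round(n); // "2.4" -> 2; unusable -> null
}

// transformPriority("High")   => 1
// transformPriority("2.4")    => 2
// transformPriority("urgent") => null (fall back to an error or a user prompt)
```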

The data importer you choose should include all of these features while also plugging into your product seamlessly. When it's time to choose a data import solution, your goal is to not only save internal engineering resources but to also wow customers with an amazing data import experience.

Flatfile’s data onboarding platform is a data importer that you can drop into your web app in a matter of hours to help your users import data in seconds. If you’re interested in learning more, sign up for free or request a demo.

Want to learn more about the Flatfile Data Exchange Platform?

See how it works