Plugins in Flatfile are reusable, modular components that extend the platform’s data import and processing capabilities. They provide pre-built functionality for common data transformation tasks, integrations, and workflows, enabling rapid development of sophisticated import solutions.
What are Plugins?
A Plugin is a self-contained module that:
- Extends Functionality - Adds new capabilities to your import workflows
- Encapsulates Logic - Packages complex operations into reusable components
- Provides Integration - Connects to external systems and services
- Enables Customization - Allows configuration for specific use cases
Plugins operate within Apps and can be used by Listeners to perform specialized tasks without writing custom code.
Plugin Architecture
Plugin Structure
// Basic plugin structure
export const myPlugin = (config = {}) => {
return {
// Plugin metadata
name: "my-custom-plugin",
version: "1.0.0",
description: "A custom plugin for data processing",
// Plugin lifecycle hooks
onMount: async (context) => {
// Initialization logic
},
onRecord: async (record, context) => {
// Per-record processing
return record;
},
onBatch: async (records, context) => {
// Batch processing logic
return records;
},
onComplete: async (context) => {
// Cleanup and finalization
},
};
};
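To see how these hooks fit together, the hypothetical `runPlugin` driver below (a sketch, not part of the Flatfile SDK) invokes them in the order a host would: mount once, per-record, batch, then complete.

```javascript
// Hypothetical driver (not part of the Flatfile SDK) showing the order
// in which a host would invoke the lifecycle hooks defined above.
const runPlugin = async (plugin, records, context = {}) => {
  await plugin.onMount?.(context); // one-time initialization
  let processed = [];
  for (const record of records) {
    // Per-record hook; fall back to the record unchanged if absent
    processed.push((await plugin.onRecord?.(record, context)) ?? record);
  }
  // Batch hook runs once over the per-record results
  processed = (await plugin.onBatch?.(processed, context)) ?? processed;
  await plugin.onComplete?.(context); // cleanup and finalization
  return processed;
};
```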
Plugin Configuration
import { myPlugin } from "@my-company/flatfile-plugin";
const configuredPlugin = myPlugin({
// Plugin-specific configuration
apiKey: process.env.EXTERNAL_API_KEY,
batchSize: 1000,
retryAttempts: 3,
// Custom options
transformRules: [
{ field: "name", transform: "uppercase" },
{ field: "email", transform: "lowercase" },
],
});
Core Plugins
Record Hook Plugin
Applies custom logic to individual Records:
import { recordHook } from "@flatfile/plugin-record-hook";
// Basic record transformation
const customerRecordHook = recordHook("customers", (record) => {
// Transform data
record.set(
"fullName",
`${record.get("firstName")} ${record.get("lastName")}`
);
// Validate email
const email = record.get("email");
if (email && !email.includes("@")) {
record.addError("email", "Invalid email format");
}
// Add computed fields
record.set("emailDomain", email?.split("@")[1]);
record.set("processedAt", new Date().toISOString());
return record;
});
// Advanced record hook with async operations
const enrichmentHook = recordHook("customers", async (record, context) => {
const email = record.get("email");
if (email) {
try {
// Enrich with external data
const enrichedData = await enrichmentAPI.lookup(email);
record.set("company", enrichedData.company);
record.set("jobTitle", enrichedData.jobTitle);
record.set("industry", enrichedData.industry);
} catch (error) {
record.addWarning("enrichment", "Could not enrich data for this record");
}
}
return record;
});
Automap Plugin
Automatically maps columns from Source files to Fields in a Blueprint:
import { automap } from "@flatfile/plugin-automap";
// Basic automap configuration
const automapPlugin = automap({
accuracy: "confident", // 'strict', 'confident', 'loose'
matchStrategy: "fuzzy", // 'exact', 'fuzzy', 'ai'
// Custom mapping rules
customMappings: {
"First Name": "firstName",
"Last Name": "lastName",
"Email Address": "email",
"Phone Number": "phone",
},
// Field transformation during mapping
transforms: {
email: (value) => value?.toLowerCase().trim(),
phone: (value) => value?.replace(/\D/g, ""), // Remove non-digits
name: (value) => value?.replace(/\s+/g, " ").trim(),
},
});
// Advanced automap with AI-powered suggestions
const aiAutomapPlugin = automap({
accuracy: "confident",
matchStrategy: "ai",
// AI configuration
ai: {
model: "gpt-4",
contextPrompt: "Map customer data fields for CRM import",
confidenceThreshold: 0.8,
},
// Fallback mappings
fallback: {
strategy: "manual",
requireConfirmation: true,
},
});
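As a rough illustration of what a fuzzy matchStrategy does (a simplified sketch, not the plugin's actual algorithm), incoming headers can be normalized before comparison so that a column like "Email Address" lines up with a field named `email`:

```javascript
// Simplified fuzzy header matcher: normalize both sides, then compare.
// Real fuzzy matching also scores partial overlaps; this sketch only
// strips case, whitespace, and punctuation.
const normalize = (s) => s.toLowerCase().replace(/[^a-z0-9]/g, "");

const matchHeader = (header, fields) =>
  fields.find((f) => {
    const h = normalize(header);
    const n = normalize(f);
    return h === n || h.includes(n) || n.includes(h);
  }) ?? null;

matchHeader("Email Address", ["firstName", "email", "phone"]); // → "email"
```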
Delimiter Extractor Plugin
Handles various file formats and delimiters:
import { delimiterExtractor } from "@flatfile/plugin-delimiter-extractor";
const extractorPlugin = delimiterExtractor({
// Supported formats
formats: ["csv", "tsv", "pipe", "custom"],
// Auto-detection settings
autoDetect: {
delimiter: true,
encoding: true,
headers: true,
},
// Custom delimiter configuration
customDelimiter: {
separator: "|",
quote: '"',
escape: "\\",
lineTerminator: "\n",
},
// Data cleaning options
cleanup: {
trimWhitespace: true,
removeEmptyRows: true,
normalizeLineEndings: true,
},
// Large file handling
streaming: {
enabled: true,
chunkSize: 10000,
maxFileSize: "100MB",
},
});
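The autoDetect options above imply delimiter sniffing. A minimal sketch of the idea (not the extractor's actual implementation, which also handles quoting and encodings) is to pick the candidate that appears most consistently across sample lines:

```javascript
// Simplified delimiter sniffing: choose the candidate that occurs a
// consistent, non-zero number of times on every sample line. Real
// extractors also account for quoted fields; this sketch does not.
const detectDelimiter = (sample, candidates = [",", "\t", "|", ";"]) => {
  const lines = sample.split("\n").filter((l) => l.trim().length > 0);
  let best = candidates[0];
  let bestScore = -1;
  for (const d of candidates) {
    const counts = lines.map((l) => l.split(d).length - 1);
    const consistent = counts.every((c) => c === counts[0] && c > 0);
    const score = consistent ? counts[0] : 0;
    if (score > bestScore) {
      best = d;
      bestScore = score;
    }
  }
  return best;
};

detectDelimiter("a|b|c\n1|2|3"); // → "|"
```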
Validation Plugins
Comprehensive data validation capabilities:
import { validator } from "@flatfile/plugin-validator";
const validationPlugin = validator({
// Built-in validators
rules: {
email: {
validator: "email",
message: "Must be a valid email address",
},
phone: {
validator: "phone",
options: { country: "US" },
message: "Must be a valid US phone number",
},
required: {
validator: "required",
fields: ["firstName", "lastName", "email"],
},
},
// Custom validators
customValidators: {
businessEmail: (value) => {
const freeEmailDomains = ["gmail.com", "yahoo.com", "hotmail.com"];
const domain = value?.split("@")[1];
return !freeEmailDomains.includes(domain);
},
strongPassword: (value) => {
return (
value?.length >= 8 &&
/[A-Z]/.test(value) &&
/[a-z]/.test(value) &&
/\d/.test(value)
);
},
},
// Validation behavior
behavior: {
stopOnFirstError: false,
allowPartialValidation: true,
markInvalidRecords: true,
},
});
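The behavior options govern how rules are executed. A hypothetical mini-runner (a sketch, not the validator plugin's internals) makes the effect of stopOnFirstError concrete: with it disabled, every failing field is collected rather than bailing out on the first failure.

```javascript
// Sketch of rule execution: with stopOnFirstError: false, every failing
// field is collected; with true, execution stops after the first failure.
const validateRecord = (record, rules, { stopOnFirstError = false } = {}) => {
  const errors = [];
  for (const [field, validate] of Object.entries(rules)) {
    if (!validate(record[field])) {
      errors.push(field);
      if (stopOnFirstError) break;
    }
  }
  return errors;
};

const rules = {
  email: (v) => typeof v === "string" && v.includes("@"),
  phone: (v) => typeof v === "string" && /^\d{10}$/.test(v),
};

validateRecord({ email: "bad", phone: "123" }, rules); // → ["email", "phone"]
validateRecord({ email: "bad", phone: "123" }, rules, { stopOnFirstError: true }); // → ["email"]
```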
Integration Plugins
CRM Integration Plugins
Connect to popular CRM systems:
import { salesforcePlugin } from "@flatfile/plugin-salesforce";
import { hubspotPlugin } from "@flatfile/plugin-hubspot";
// Salesforce integration
const sfPlugin = salesforcePlugin({
connection: {
clientId: process.env.SF_CLIENT_ID,
clientSecret: process.env.SF_CLIENT_SECRET,
username: process.env.SF_USERNAME,
password: process.env.SF_PASSWORD,
sandbox: false,
},
// Object mapping
objectMapping: {
customers: "Contact",
companies: "Account",
},
// Field mapping
fieldMapping: {
firstName: "FirstName",
lastName: "LastName",
email: "Email",
company: "Account.Name",
},
// Sync options
sync: {
mode: "upsert", // 'insert', 'update', 'upsert'
matchField: "email",
batchSize: 200,
},
});
// HubSpot integration
const hsPlugin = hubspotPlugin({
apiKey: process.env.HUBSPOT_API_KEY,
// Portal configuration
portal: {
id: "your-portal-id",
domain: "your-domain.hubspot.com",
},
// Contact properties mapping
propertyMapping: {
firstName: "firstname",
lastName: "lastname",
email: "email",
company: "company",
},
});
Database Plugins
Direct database connectivity:
import { postgresPlugin } from "@flatfile/plugin-postgres";
import { mongoPlugin } from "@flatfile/plugin-mongo";
// PostgreSQL integration
const pgPlugin = postgresPlugin({
connection: {
host: process.env.PG_HOST,
port: 5432,
database: process.env.PG_DATABASE,
username: process.env.PG_USERNAME,
password: process.env.PG_PASSWORD,
ssl: true,
},
// Table mapping
tableMapping: {
customers: "public.customers",
orders: "public.orders",
},
// Query configuration
queries: {
upsert: true,
conflictResolution: "update",
returning: ["id", "created_at"],
},
});
// MongoDB integration
const mongoDbPlugin = mongoPlugin({
connectionString: process.env.MONGO_URI,
database: "customer_data",
// Collection mapping
collections: {
customers: "customers",
profiles: "user_profiles",
},
// Document options
options: {
upsert: true,
ordered: false,
writeConcern: { w: "majority" },
},
});
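Both integrations write in fixed-size batches (for example, batchSize: 200 in the Salesforce config above). A generic chunking helper like the sketch below (hypothetical, not part of either plugin) shows the idea:

```javascript
// Generic chunking helper: database and CRM plugins typically split
// records into fixed-size batches before writing them downstream.
const chunk = (records, size) => {
  const batches = [];
  for (let i = 0; i < records.length; i += size) {
    batches.push(records.slice(i, i + size));
  }
  return batches;
};

chunk([1, 2, 3, 4, 5], 2); // → [[1, 2], [3, 4], [5]]
```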
Custom Plugin Development
Creating a Custom Plugin
// plugins/data-enrichment.js
export const dataEnrichmentPlugin = (config = {}) => {
const {
apiKey,
enrichmentFields = ["company", "jobTitle"],
batchSize = 100,
} = config;
return {
name: "data-enrichment-plugin",
version: "1.0.0",
async onMount(context) {
// Initialize the enrichment service (EnrichmentAPI is assumed to be
// provided by your enrichment vendor's SDK)
this.enrichmentService = new EnrichmentAPI({
apiKey: apiKey,
timeout: 10000,
});
console.log("Data enrichment plugin mounted");
},
async onBatch(records, context) {
const emailBatch = records
.map((record) => record.get("email"))
.filter((email) => email && email.includes("@"));
if (emailBatch.length === 0) return records;
try {
// Batch enrichment API call
const enrichedData = await this.enrichmentService.batchLookup(
emailBatch
);
// Apply enriched data to records
records.forEach((record) => {
const email = record.get("email");
const enrichment = enrichedData[email];
if (enrichment) {
enrichmentFields.forEach((field) => {
if (enrichment[field]) {
record.set(field, enrichment[field]);
}
});
record.set("enrichedAt", new Date().toISOString());
}
});
} catch (error) {
console.error("Enrichment failed:", error);
// Add warning to all records in batch
records.forEach((record) => {
record.addWarning(
"enrichment",
"Data enrichment temporarily unavailable"
);
});
}
return records;
},
async onComplete(context) {
// Cleanup and reporting
const stats = await this.enrichmentService.getStats();
console.log(
`Enrichment complete. Processed: ${stats.processed}, Enriched: ${stats.enriched}`
);
},
};
};
Plugin Testing
// plugins/data-enrichment.test.js
import { expect } from "chai";
import { createMockRecord, createMockContext } from "@flatfile/testing";
import { dataEnrichmentPlugin } from "./data-enrichment.js";
describe("Data Enrichment Plugin", () => {
let plugin;
beforeEach(() => {
plugin = dataEnrichmentPlugin({
apiKey: "test-key",
enrichmentFields: ["company", "jobTitle"],
});
});
it("should enrich records with company data", async () => {
// Stub the enrichment service so the test does not call a real API
plugin.enrichmentService = {
batchLookup: async () => ({
"john@acme.com": { company: "Acme Corp", jobTitle: "Software Engineer" },
}),
};
const records = [
createMockRecord({ email: "john@acme.com" }),
createMockRecord({ email: "jane@example.org" }),
];
const context = createMockContext();
const enrichedRecords = await plugin.onBatch(records, context);
expect(enrichedRecords[0].get("company")).to.equal("Acme Corp");
expect(enrichedRecords[0].get("jobTitle")).to.equal("Software Engineer");
});
it("should handle enrichment API failures gracefully", async () => {
// Stub a failing service to simulate an API outage
plugin.enrichmentService = {
batchLookup: async () => {
throw new Error("API unavailable");
},
};
const records = [createMockRecord({ email: "test@example.com" })];
const context = createMockContext();
const result = await plugin.onBatch(records, context);
expect(result[0].getWarnings()).to.have.length(1);
expect(result[0].getWarnings()[0].message).to.include(
"enrichment temporarily unavailable"
);
});
});
Plugin Management
Plugin Discovery and Installation
# Install plugins via npm
npm install @flatfile/plugin-record-hook
npm install @flatfile/plugin-automap
npm install @my-company/custom-plugin
// Import and configure
import { recordHook } from '@flatfile/plugin-record-hook';
import { automap } from '@flatfile/plugin-automap';
import { customPlugin } from '@my-company/custom-plugin';
const plugins = [
automap({ accuracy: 'confident' }),
recordHook('customers', customerTransform),
customPlugin({ apiKey: process.env.API_KEY })
];
Plugin Versioning
// Specify plugin versions
const plugins = [
automap({ version: "^2.1.0", accuracy: "confident" }),
recordHook({ version: "~1.5.2" }, "customers", transform),
customPlugin({ version: "latest", apiKey: process.env.API_KEY }),
];
// Version compatibility checking
const compatibility = await checkPluginCompatibility(plugins);
if (!compatibility.isCompatible) {
console.warn("Plugin compatibility issues:", compatibility.conflicts);
}
Plugin Performance
// Optimize plugin performance
const optimizedPlugin = myPlugin({
// Caching configuration
cache: {
enabled: true,
ttl: 3600, // 1 hour
maxSize: 1000,
},
// Batch processing
batch: {
size: 1000,
timeout: 30000,
parallel: 4,
},
// Resource limits
resources: {
memory: "512MB",
timeout: 300000,
},
});
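The cache block above (ttl, maxSize) corresponds to a simple time-to-live cache. A minimal sketch in plain JavaScript (illustrative only, not the plugin system's actual cache) might look like:

```javascript
// Minimal TTL cache sketch matching the cache options above: entries
// expire after `ttl` seconds, and the oldest entry is evicted once
// `maxSize` is reached. Map preserves insertion order, so the first
// key is always the oldest.
class TtlCache {
  constructor({ ttl = 3600, maxSize = 1000 } = {}) {
    this.ttl = ttl * 1000;
    this.maxSize = maxSize;
    this.map = new Map();
  }
  get(key) {
    const entry = this.map.get(key);
    if (!entry) return undefined;
    if (Date.now() - entry.at > this.ttl) {
      this.map.delete(key); // expired
      return undefined;
    }
    return entry.value;
  }
  set(key, value) {
    if (this.map.size >= this.maxSize && !this.map.has(key)) {
      this.map.delete(this.map.keys().next().value); // evict oldest
    }
    this.map.set(key, { value, at: Date.now() });
  }
}
```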
Plugin Monitoring
// Monitor plugin performance
const monitoredPlugin = withMonitoring(
myPlugin({
// Plugin config
}),
{
metrics: ["duration", "memory", "success_rate"],
alerts: {
slowExecution: { threshold: 5000 }, // 5 seconds
highMemory: { threshold: "256MB" },
lowSuccessRate: { threshold: 0.95 },
},
}
);
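`withMonitoring` above is shown as an assumed wrapper utility. The timing part of such a wrapper can be sketched in plain JavaScript (a hypothetical helper, not the actual monitoring implementation):

```javascript
// Hypothetical sketch of a monitoring wrapper: time each call and
// record whether it succeeded, pushing results to a shared metrics array.
const withTiming = (fn, metrics) =>
  async (...args) => {
    const start = Date.now();
    try {
      const result = await fn(...args);
      metrics.push({ duration: Date.now() - start, ok: true });
      return result;
    } catch (error) {
      metrics.push({ duration: Date.now() - start, ok: false });
      throw error; // re-throw so callers still see the failure
    }
  };
```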
Best Practices
Plugin Design
- Keep plugins focused on single responsibilities
- Design for reusability across different projects
- Implement proper error handling and recovery
- Use configuration objects for flexibility
Performance
- Implement efficient batch processing
- Use caching for expensive operations
- Set appropriate resource limits
- Monitor plugin performance metrics
Security
- Validate all input data and configuration
- Use secure credential management
- Implement proper access controls
- Audit plugin operations
Testing
- Write comprehensive unit tests
- Test with various data scenarios
- Validate error handling paths
- Performance test with large datasets
Related Concepts
- Apps - Applications that use plugins
- Listeners - Event handlers that can invoke plugins
- Records - Data that plugins process
- Workbooks - Data containers that plugins operate on
- Blueprints - Data structures that plugins work with
- Fields - Individual data elements that plugins transform
Plugins provide the building blocks for extending Flatfile’s capabilities, enabling you to create sophisticated data import solutions without writing complex custom code from scratch.