# Platform
Source: https://flatfile.com/docs/changelog/platform
Notable additions and updates to the Flatfile platform
Enhanced agent discovery with new search capabilities. Users can now search for agents by either agent slug or agent ID directly from both the agents list view and the agents detail page.
Optimized performance of the getAgents endpoint by implementing pagination, significantly improving response times and scalability when working with large collections of agents.
Records with readonly configurations are now handled differently in the
replaceRecords operation. A new option called excludeReadOnly has been
introduced to control whether readonly records or fields should be skipped or
replaced. If excludeReadOnly is set to true, records with a readonly config or
fields marked as readonly will not be replaced during the replaceRecords
operation. If excludeReadOnly is false or not provided, readonly records and
fields will be replaced as before.
The findAndReplaceRecords method in the WorkbooksService now checks if the field
being replaced is readonly and throws an error if excludeReadOnly is set to
true, preventing modification of readonly fields.
Several new tests have been added to validate the behavior of the replaceRecords
operation with readonly records and fields, covering scenarios with
excludeReadOnly set to true and false.
The SearchRecordsOptions and GetCellValuesOptions interfaces have been updated
to include the new excludeReadOnly option.
Example usage:
```typescript
// Replace non-readonly records or fields
const records = await wb.op.replaceRecords({
  sheetId: wb.contacts.id,
  options: {},
  find: "John",
  replace: "Jonathan",
  field: { key: "firstName" },
  legacyReplace: false,
  actorId: UserId.make(),
});

// Skip readonly records or fields
const recordsSkippingReadOnly = await wb.op.replaceRecords({
  sheetId: wb.contacts.id,
  options: { excludeReadOnly: true },
  find: "John",
  replace: "Jonathan",
  field: { key: "firstName" },
  legacyReplace: false,
  actorId: UserId.make(),
});
```
Added language support for:
* Spanish, Latin America (es-419)
* Malay (ms)
* Dutch (nl)
* Polish (pl)
* Swedish (sv)
* Thai (th)
* Chinese Traditional (zh-hant)
* Chinese Traditional Hong Kong (zh-hant-HK)
Namespace modification is now supported via the API only. Other app fields remain editable via the UI, but the ability to edit a namespace post-creation has been removed from the UI.
Fixes a bug where long document titles would overlap the page header.
The order of imports from the `@flatfile/api` module has been updated. The `RecordUpdates` and `Success` types are now imported in a different order.
The implementation of the `clearColumn` method has been refactored. Instead of using the `map` and `filter` array methods, it now uses the `reduce` method to generate the `recordsUpdates` array. The new implementation checks if the field value exists and if the field is not read-only before clearing it. This change likely improves performance and readability.
The wording of the `info` and `outcome.message` properties in the `transitionJob` call has been updated from "Column was removed" to "Column was cleared". This change provides more accurate messaging about the action performed.
Additionally, the code now handles read-only fields correctly. If a field is marked as read-only in the `RecordConfig`, its value will not be cleared.
These changes improve the functionality, performance, and messaging of the `clearColumn` logic without introducing any breaking changes to the external interface.
The unique validation and record/cell validity handling has been improved to:
* properly escape field keys containing special characters like periods
* correctly apply the `error` state and appropriate error messages when duplicate values are encountered for fields with a unique constraint
* handle cases where all initially added records have invalid (duplicate) values for fields with a unique constraint
When clearing a column in a job operation, a snapshot of the sheet will be taken
before the column is cleared. This snapshot will be labeled with the specific
column key that is being cleared, providing a record of the sheet state before
the column clearing operation. The snapshot is taken by calling a new utility
function `takeSnapshotForJob`, which handles creating and labeling the snapshot
based on the job, sheet, and column details. This allows developers to review
the sheet state prior to the column clearing if needed.
The release adds the ability to mark a space as an app template and filter
spaces based on whether they are app templates or not.
This can be used with the new space-configure-from-template plugin to create new
spaces using an existing space as a template.
When using Autobuild for a new App, the space generated during Autobuild will
automatically be tagged as the template for that App.
We added the ability to sort enum fields by label, value, or ordinal within the enum field config. This ensures that enum options will be displayed in the chosen order in the mapping and review tables. The default sortBy is label.
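A minimal sketch of an enum field using the new option (the field shape is abbreviated and the values are illustrative; `sortBy` accepts `label`, `value`, or `ordinal` per the note above):

```typescript
// Hypothetical enum field config sketch; sortBy defaults to "label".
const statusField = {
  key: "status",
  type: "enum",
  label: "Status",
  config: {
    sortBy: "value", // display options ordered by value instead of the default label order
    options: [
      { value: "1", label: "Beta" },
      { value: "2", label: "Alpha" },
    ],
  },
};
```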
The release introduces the ability to filter spaces by the associated app ID
when retrieving a list of spaces. A new query parameter `appId` has been added
to the `/v1/spaces` endpoint. Developers can now pass the `appId` query
parameter to retrieve only the spaces associated with the specified app. For
example, `?appId=us_app_123` will return only spaces linked to the app with ID
`us_app_123`. If an invalid or non-existent app ID is provided, an empty list
will be returned.
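As a sketch, the resulting request URL can be assembled like this (the base URL and helper function are illustrative, not part of the SDK):

```typescript
// Builds the spaces list URL with the new optional appId filter.
function buildSpacesUrl(baseUrl: string, appId?: string): string {
  const url = new URL("/v1/spaces", baseUrl);
  if (appId) url.searchParams.set("appId", appId);
  return url.toString();
}

// buildSpacesUrl("https://platform.flatfile.com", "us_app_123")
// → "https://platform.flatfile.com/v1/spaces?appId=us_app_123"
```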
An error is now thrown if attempting to clear a readonly column in a sheet. This
prevents modifying data in readonly columns.
Example:
```typescript
const field = {
  key: 'ReadonlyColumn',
  label: 'Readonly Column',
  type: 'string',
  readonly: true
}
```
Attempting to clear this column will throw an error: 'Column is readonly and
cannot be cleared'.
An error is also thrown if attempting to clear any column in a readonly sheet.
This prevents modifying data in readonly sheets.
Example:
```typescript
const sheet = {
  config: {
    readonly: true,
    fields: [...]
  }
}
```
Attempting to clear a column in this sheet will throw an error: 'Sheet is
readonly. Column cannot be cleared'.
These changes ensure data integrity by preventing unintended modification of
readonly data.
The `clearColumnAction` now has a constraint added that checks if the column is enabled before allowing the action to be executed. This ensures that the action can only be performed on columns that are currently enabled, preventing any potential issues or errors that could arise from attempting to clear a disabled column.
The manual entry and file upload buttons shown when the workbook is empty have had their max width updated so they can support more text.
Trailing empty records in CSV files will now be trimmed during file extraction.
Empty records in-between records with values will continue to be kept.
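The trimming rule described above can be sketched as a pure function (an illustration of the behavior, not Flatfile's actual extractor code):

```typescript
// Drop trailing all-empty records; keep empty records between populated ones.
function trimTrailingEmptyRecords(records: string[][]): string[][] {
  let end = records.length;
  while (end > 0 && records[end - 1].every((cell) => cell.trim() === "")) {
    end--;
  }
  return records.slice(0, end);
}
```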
* Record order is now maintained when inserting records
* When using the Flatfile API to add records, the order of those records will
now be preserved keeping the original sequence as provided
Fixes a bug where AI assist would crash when the `config` object or the `options` array is undefined.
Fixes a bug where an error would be thrown when `config.options` was undefined during sheet upsert operations.
Fixed a bug where environment secret names would not update when edited; environment secret names now update correctly.
The `viewMapped` functionality has been updated to include new text strings for
messaging related to updating a table to only view mapped fields. The messages
include "Updating the table to only view mapped fields", "Halfway there, hang
tight...", "Almost done...", "Table update complete. Please audit the data", and
"An error occurred while updating the workbook. See Event Logs." These strings
will be displayed to the user during the process of updating a table to show
only mapped fields, providing status updates and handling potential errors. This
change enhances the user experience by providing clear communication during this
operation.
A bug fix was made to the handling of enum field options when updating a sheet.
Previously, if the `config.options` property was undefined, an error would
occur. Now, if `config.options` is undefined, it defaults to an empty array.
This ensures that the code can safely handle cases where `config.options` is not
present or is undefined. For example, if a developer was updating a sheet with
an enum field that did not previously have any options defined, the update can
now proceed without errors.
Added three keys: `addColumn`, `clearColumn`, and `removeColumn` with their
respective translations for adding, clearing, and removing columns.
In the English translation file, the title of the `mapValues` section has been updated from "Map values" to "Map Fields".
Updates to the mapping flow to prevent a user from accidentally navigating
"back" into a mapping flow after it has already been completed.
**Bug Fixes in Edit and Create App Forms**
We've resolved several issues affecting the Edit and Create App forms:
* Fixed form validation issues
* Improved error handling
* Enhanced user feedback
Updated the AutoBuild flow to allow editing the working space name during the
build process. The name can now be edited by clicking an icon next to the space
name, which reveals an input field to enter a new name. The name change is saved
when the input field loses focus or the Enter key is pressed. Added a badge
indicating "Dev Mode" when not in the production environment.
Simplified the loading step by removing the detailed progress indicators and
showing only a centered loader component.
Updated the BlueprintTab component to display a bottom border and align the
blueprint view action icons to the right.
Implemented sheet ordering functionality in the sidebar. Sheets are now
displayed in the order specified by the `sheetSidebarOrder` setting in the
workbook. Any sheets not included in the order are appended to the end.
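The ordering behavior can be sketched as follows (illustrative only, not the actual UI implementation; the `SheetRef` shape is an assumption):

```typescript
interface SheetRef {
  id: string;
  name: string;
}

// Sheets listed in sheetSidebarOrder come first, in that order;
// any sheets not included in the order are appended to the end.
function orderSheets(sheets: SheetRef[], sheetSidebarOrder: string[]): SheetRef[] {
  const ordered = sheetSidebarOrder
    .map((id) => sheets.find((s) => s.id === id))
    .filter((s): s is SheetRef => s !== undefined);
  const remaining = sheets.filter((s) => !sheetSidebarOrder.includes(s.id));
  return [...ordered, ...remaining];
}
```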
The getDataDiff function has been updated to handle array values correctly when
comparing proposed changes with existing source (src) and artifact data.
If the proposed array value is the same as the existing array value in either
the source or artifacts, it will not be included in the changeset returned by
getDataDiff.
The function now uses the lodash isEqual utility to compare array values,
instead of strict equality (===), to account for differences in order or
reference of array elements.
For example, if the proposed tags array is `['jedi', 'sith']`, it will not be
included in the changeset if the existing source tags are `['JEDI', 'SITH']` or if
the existing artifact tags are `['jedi', 'sith']`, because the values are
considered equal despite differences in capitalization or array order.
This change ensures that unnecessary changes are not included in the changeset
when working with array values, reducing noise and improving the accuracy of the
changeset calculation.
`@flatfile/api@1.9.17`
Secrets have been updated to introduce new optional fields for filtering.
A new optional `actorId` field of type `ActorId` has been added to the
`ListSecrets` request type and the `WriteSecret` type for secrets. This allows
filtering secrets by the associated actor (user, guest, agent, or API key).
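A hypothetical request sketch (the IDs are illustrative, not real identifiers):

```typescript
// Listing secrets scoped to a single actor via the new optional actorId field.
const listSecretsRequest = {
  environmentId: "us_env_123", // illustrative environment ID
  actorId: "us_usr_456",       // new optional filter: only secrets for this actor
};
```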
`@flatfile/api@1.9.16`
Added new methods for managing stored sheet constraints: `getConstraints`,
`createConstraint`, `getConstraintById`, `updateConstraint`, and
`deleteConstraint`.
`@flatfile/api@1.9.15`
With the addition of archived spaces, there is now an `unarchiveSpace` method
and a `space:unarchived` event.
Deprecated use of metadata field at the cell value level, recommending use of
record level metadata instead.
`spaces-ui`
Updates to the Snapshots resource to track changes to a sheet's schema (i.e. added
and removed columns). A more robust Snapshot will enable more complex features
when comparing two different snapshots.
`@flatfile/api@1.9.14`
#### Actions Resource
Actions are now a first-class resource. Developers may now create, retrieve,
update, and delete actions via the API. New action events `ActionCreated`,
`ActionUpdated`, and `ActionDeleted` are now being published, and an `actionId`
field has been added to the Context of events.
#### Agent Versioning
Agent versioning has been introduced. You may now retrieve agent versions via
the API and revert to a previous version via the API or the dashboard.
#### Token Refresh
Guests actively using their space will have their tokens automatically renewed.
#### Sheet & Cell Updates
Support for array values in cell values has also been added.
#### Job Metadata
Metadata is now available on the `JobConfig` type, allowing additional data to be
associated with jobs.
Session name validation has been introduced to the dashboard, preventing the
creation of new sessions with the same name.

Note that sessions created via embedded or the API will not be constrained; this
is a dashboard-only validation.
`@flatfile/api@1.9.13` `spaces-ui`
New fields 'guide' and 'guardrail' have been added to Actions. These options
enable providing markdown guidance and warnings to your users.
```typescript
actions: [
  {
    operation: 'submitActionFg',
    mode: 'foreground',
    label: 'Submit For Approval',
    type: 'string',
    description: 'Submit this data to a webhook.',
    primary: true,
    guide: {
      content: "### Personalized Guide \n\n Before performing..."
    },
    guardrail: {
      content: "### Warning! \n\n This Submit ..."
    }
  },
  {...}
],
```
The markdown content of `guide` will become accessible to the end user via a
tooltip on the action element.
The `guardrail` component will render as a modal warning before the action event
is triggered. This can be useful for actions that have critical consequences or
require user acknowledgment before proceeding.
`spaces-ui`
The "requireAllValidTotal" option has been added for custom actions in the Sheet
toolbar. This new option requires all rows in the Sheet to be valid for the
action to be enabled, regardless of selection. If "requireAllValidTotal" is set
and there are any validation errors in the entire Sheet, the custom action will
be disabled with the corresponding tooltip message.
The existing "requireAllValid" option has been updated to only apply to the
selected rows, rather than all rows. If "requireAllValid" is set and there are
validation errors in the selected rows, the custom action will be disabled with
the corresponding tooltip message.
`spaces-ui`
Introduces support for handling locked sheets (indicated by the locked icon),
and improves the handling of import files.
`@flatfile/api@1.9.8`
A new `reference-list` property type has been added to allow defining an array
of values referenced from another sheet. Links will be established automatically
by the matching engine (or similar) based on an evaluation of unique or similar
columns between datasets.
The `ReferencePropertyConfig` has been updated to make the `relationship`
property optional, allowing it to be omitted.
`@flatfile/api@1.9.7`
New options have been added to the ListWorkbooksRequest to allow filtering
workbooks by name, namespace, label, and treatment, as well as controlling
whether to include sheets. For example, you can now pass a `name` parameter to
filter workbooks by name, or a `treatment` parameter to filter by treatment. The
`includeSheets` boolean parameter controls whether sheet data is included in the
response.
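A hypothetical request sketch (parameter names are from the changelog; the values, including the treatment enum string, are illustrative and may differ from the SDK's exact types):

```typescript
// Filtering workbooks with the new ListWorkbooksRequest options.
const listWorkbooksRequest = {
  spaceId: "us_sp_123",               // illustrative space ID
  name: "Contacts",                   // filter workbooks by name
  namespace: "onboarding",            // filter by namespace
  treatment: "EXTRACTED_FROM_SOURCE", // illustrative treatment value
  includeSheets: false,               // omit sheet data from the response
};
```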
`@flatfile/api@1.9.6`
In AgentConfig type, a new optional property `sourceMap` of type string has been
added. This allows including a source map for the agent code.
`@flatfile/api@1.9.5`
A new property `treatments` of type `Flatfile.WorkbookTreatments[]` has been
added to the `Workbook` and `CreateWorkbookConfig` types. This array allows
specifying treatments for a workbook, with the currently available treatment
being `ExtractedFromSource`. This change introduces a new external interface
that developers using the package should be aware of when creating or updating
workbooks.
Example usage:
```typescript
import { Flatfile } from "@flatfile/api";

const workbookConfig: Flatfile.CreateWorkbookConfig = {
  // ...other properties
  treatments: [Flatfile.WorkbookTreatments.ExtractedFromSource],
};
```
`@flatfile/api@1.9.3`
A new `JobOutcomeTrigger` type has been introduced to specify whether a job
outcome's effect should be triggered automatically or manually.
**Auto-Expanding Cell Input in Editor Component**
We have enhanced the cell editing experience with an auto-expanding cell input
that dynamically adjusts to fit the content. This improvement ensures that users
can view and edit their data without constraints, enhancing the overall editing
experience.
`spaces-ui`
Improvements have been made to error handling and user feedback in the file
import and mapping process. If an error occurs, a popover is displayed to the
user with the error message, providing clearer feedback on failures during the
import process.
**Select Header Row**
We've added a new feature that allows users to select the header row in the data
preview. This feature is particularly useful when the first row of your data
contains column headers, as it enables you to specify the header row for
accurate data mapping.
**Search for Users**
You can now search for users in the manager users page. This feature allows you
to quickly find users by name, email, or role, making it easier to manage your
user base.

**Resend User Invite**
There is now a `/users/:userId/resend-invite` endpoint enabling admins to resend
an invitation to a user who has not yet accepted it.
**Bulk Deletion of Mapping Rules**
We added a method to delete multiple mapping rules from a program. This new
`deleteMultipleRules` method simplifies the management of mapping rules by
allowing bulk deletions. The `DeleteMultipleRulesRequest` type represents the
request payload, detailing the array of rule IDs to be deleted.
**New Method to Delete Apps**
We have introduced a new `delete` method that allows you to delete an app.
**Enhanced Validation Messages for Records**
The `ValidationMessage` type has been enhanced with two new optional properties:
`field` and `path`. The `field` property specifies which field the validation
message pertains to, while the `path` property, of type `JsonPathString`,
specifies the JSONPath for the validation message. These enhancements provide
more context and precision in validation feedback.
**Enhanced Job Outcomes with Custom Views**
The `JobOutcomeNext` type now includes a new `view` option, allowing jobs to
specify a custom view that should be displayed upon completion. To support this,
we introduced the `JobOutcomeNextView` type, which provides details for the
custom view, including the sheet ID, hidden columns, and an optional label.
**New Method to Update Sheets**
We introduced a new `updateSheet` method that allows you to update a sheet's
name, slug, and metadata. The `SheetUpdateRequest` type represents the update
request payload, including the name, slug, and metadata of the sheet.
Additionally, an optional `metadata` property has been added to the `Sheet` and
`SheetUpdate` types, allowing for the storage of contextual metadata related to
the sheet.
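A hypothetical `SheetUpdateRequest` payload sketch (values are illustrative):

```typescript
// Updating a sheet's name, slug, and contextual metadata via updateSheet.
const sheetUpdateRequest = {
  name: "Customers",
  slug: "customers",
  metadata: { source: "crm-import" }, // arbitrary contextual metadata for the sheet
};
```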
`spaces-ui`
A new "Update View" button can now be found in the job outcome modal for jobs
that modify the visibility of columns in a workbook sheet. This button triggers
updating the column visibility based on the job outcome.
There are two new capabilities in the `next` property of Job Outcomes:
* `view` gives developers the ability to manipulate the Sheet's view on
completion of a Job - Giving control over things like applying a filter or
hiding a column
* `download` gives developers the ability to include file details (such as a
fileName and URL), which are then used to trigger a download to the user's
browser on completion of the job
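A hypothetical outcome sketch using `view` (the discriminator and field names are assumptions based on the description above; IDs and column keys are illustrative):

```typescript
// Job outcome that changes the Sheet's view on completion.
const outcome = {
  message: "Columns hidden after import",
  next: {
    type: "view",                // assumption: discriminator for the view variant
    sheetId: "us_sh_123",        // illustrative sheet ID
    hiddenColumns: ["internalId"], // hide this column when the job completes
    label: "Update View",
  },
};
```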
**New Environment Lifecycle Events**
We've added new `Environment` domain events, expanding the scope of domains our
system recognizes. To leverage these events, listen on the following topics:
`environment:created`, `environment:updated`, and `environment:deleted`.
**Enhanced Job Configuration Descriptions**
We have added descriptions for the `DeleteRecordsJobConfig` properties to provide
better clarity: an optional `filter` param allows you to provide options to
filter records (defaulting to none), while the `filterField` param narrows the
valid/error filter results to a specific field and requires `filter` to be set.
We corrected a typo in the `JobOutcomeNextFiles` type by renaming the `file`
property to `files` to accurately represent an array of file objects.
`spaces-ui`
#### Saving and Sharing Custom Views
Users can now apply filters, sorting, and search queries to the Sheet data and
save those settings as a reusable view. Key changes include:
* Added a "Views" dropdown in the Sheet toolbar to manage and apply saved views.
* Added a "Save View" modal that allows users to name and save the current Sheet
filters/sorting as a view.
* Users can copy a sharable link with the view settings applied.
* Saved views are grouped into "My Views" and "All Views" sections.
* Added backend APIs to create, update, and delete saved views.
* Added new React hooks and components to support the saved views functionality.
* Updated translations for the new UI elements related to saved views.
**Updated `ListDocumentsResponse`**
The `data` property now returns an array of `Document` objects instead of the
previous `DocumentResponse`. This change standardizes the format and improves
consistency in how document data is handled.
**Enhanced Job Configuration**
We added a new `predecessorIds` property to `JobConfig`. This property allows
you to specify job dependencies, meaning you can list the IDs of jobs that must
be completed before the current job can start. This helps in managing and
sequencing job execution more effectively.
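A hypothetical `JobConfig` sketch using the new property (operation names and IDs are illustrative):

```typescript
// This job will not start until both predecessor jobs have completed.
const jobConfig = {
  type: "workbook",
  operation: "export",            // illustrative operation
  source: "us_wb_123",            // illustrative workbook ID
  predecessorIds: ["us_jb_111", "us_jb_222"], // jobs that must finish first
};
```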
We added a new feature to handle job outcomes more effectively with a new type
called `JobOutcomeNext` and its variant files. This includes:
* `JobOutcomeNextFileObject` Stores individual file information with an ID and
optional label.
* `JobOutcomeNextFiles` Handles multiple files, including an array of
JobOutcomeNextFileObject instances and an optional label.
We also added a waiting value to the JobStatus enum to indicate when a job is
paused or pending, improving job processing workflows.
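A hypothetical outcome sketch using the files variant (the discriminator is an assumption; IDs are illustrative):

```typescript
// Job outcome attaching downloadable files on completion.
const outcome = {
  next: {
    type: "files",               // assumption: discriminator for the files variant
    files: [
      { fileId: "us_fl_123", label: "Error report" }, // JobOutcomeNextFileObject
      { fileId: "us_fl_456" },                        // label is optional
    ],
    label: "Download results",
  },
};
```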
**Enhanced Views Request and Response**
We updated `ListViewsRequest` to include optional `pageSize` and `pageNumber`
properties, allowing for pagination when retrieving views. We've also added a
`createdBy` property to `View` and `ViewResponse` to track the creator of each
view.
**Improved event emission logic for workbook creation and update**
Previously, the `workbook:created` event was emitted even when the workbook was
only being updated. Now, the logic emits a `workbook:updated` event when
necessary. Additionally, the `POST /:workbookId/rebuild` endpoint now handles
eventing properly.
**Enhanced Date Parsing Capabilities**
**Expanded Support for Date Formats**:
We've broadened our parsing algorithms to accurately recognize and sort a wider
variety of date formats. This update ensures that when users upload files
containing date fields, the system robustly handles various international date
formats and styles.
**Improved Sorting Accuracy**:
Dates are now correctly sorted based on their actual chronological order,
regardless of the format input. This enhancement reduces errors and
inconsistencies previously encountered with date sorting, ensuring data
integrity and reliability during file uploads.
**User Experience Improvement**:
Users no longer need to modify or standardize date formats in their files before
uploading. Flatfile automatically interprets and processes diverse date inputs,
simplifying workflows and reducing manual data preprocessing.
**Sorted Workbooks in Mapping**
Selecting a Sheet in the mapping flow has been updated to apply the same sort
order consideration as the sidebar. This enhancement ensures consistency in
workbook sorting for improved user navigation.
**Origin info now exists in Files table**
Now, origin tracking is available for file uploads, supporting sources like
Google Drive, File System, Box, and OneDrive.
**Improved Duplicate Field Key Validation in Workbook Sheets**
**Case-Insensitive Checking**:
Our latest update enhances the validation process by identifying duplicate field
keys in workbook sheets, including case-insensitive comparisons. This ensures
that field keys are unique regardless of character casing, maintaining data
integrity and consistency across your datasets.
**Data Integrity Assurance**:
By preventing the entry of duplicate keys, even when differences are only in
letter casing, we enhance the accuracy and reliability of data processing within
our application.
When uploading a file, you'll now see upload speeds in a user-friendly format
(KB/s, MB/s, GB/s).
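The formatting described can be sketched as follows (an illustration of the behavior, not the actual UI code):

```typescript
// Convert a raw bytes-per-second rate into a friendly unit string.
function formatUploadSpeed(bytesPerSecond: number): string {
  const units = ["B/s", "KB/s", "MB/s", "GB/s"];
  let value = bytesPerSecond;
  let i = 0;
  while (value >= 1024 && i < units.length - 1) {
    value /= 1024;
    i++;
  }
  return `${value.toFixed(1)} ${units[i]}`;
}

// formatUploadSpeed(2048) → "2.0 KB/s"
```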
**Introducing customizable column sizes in the Flatfile Blueprint**
New sizing options have been added for string fields to enhance visual
customization and user interface consistency.
See the new
[property](https://reference.flatfile.com/api-reference/workbooks/create#request.body.sheets.fields.string.appearance).
**Enhanced Workbook Build Process and Diagnostic Logging**
**Extended Wait Time for Workbook Readiness**:
To accommodate more complex data processing needs, we have extended the maximum
wait time for workbook readiness from 30 to 120 seconds. This adjustment ensures
that larger or more complex workbooks have sufficient time to complete their
build process without interruption.
**Added Timeout Functionality**:
We've introduced a new timeout feature that automatically stops the workbook
building process if it exceeds the allotted time. This prevents prolonged waits
and potential system overloads, improving overall system reliability.
**Improved Logging Capabilities**:
To aid in troubleshooting and optimize workbook build performance, we have
enhanced our logging system. Logs now include additional context information,
offering deeper insights into the workbook building process and helping identify
and resolve issues more efficiently.
**Enhanced Cell-Level Control in Records**
We've introduced a new configuration property that allows users to set
individual cells within a record to read-only. This enhancement extends our
previous functionality where only entire columns or sheets could be designated
as read-only. Now, you can apply more granular control over data manipulation by
restricting editing at the cell level, improving data integrity and compliance.
See the
[reference](https://reference.flatfile.com/api-reference/records/update#request.body.config.readonly).
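A hypothetical record sketch showing cell-level read-only config (the per-field shape below is an assumption; consult the linked reference for the exact schema):

```typescript
// Marking a single cell read-only via record config, rather than
// the whole column or sheet.
const record = {
  values: {
    email: { value: "ada@example.com" },
    name: { value: "Ada" },
  },
  config: {
    // assumption: per-field config marks only the email cell read-only
    fields: { email: { readonly: true } },
  },
};
```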
**Support continuing to the next row on enter key press**
Users can now seamlessly move to the next row in the cell by hitting the enter
key. This enhancement improves the user experience and streamlines data entry in
the Flatfile platform.
**Multiline cell editing**
Users can now edit text in multi-line mode, enhancing the editing experience
within the platform.
**Introducing: Box Integration**
You can now upload a file from Box. Once you've connected your Box account, you
can select a file to upload directly into Flatfile.
**Introducing: Default Apps**
You can now set the default App for your Account. This App will be your default
landing page. Additionally, any unlinked Spaces will appear here.
**Refined AI Mapping Precision**
We’ve fine-tuned the interplay between AI-generated mapping suggestions and
user-defined mapping history to enhance accuracy and user trust in our system.
Previously, AI recommendations marked as “very confident” could override
mappings with a user history score greater than 0.99. To address this, we’ve now
implemented a cap on AI mapping scores, ensuring they do not exceed 0.99. This
change guarantees that high-confidence user history takes precedence, fostering
a more reliable and user-centric mapping experience.
**Streamlined User Invitation Process**
We've enhanced our user invitation system for improved clarity and data
integrity. Each user can now hold only one active invitation at any given time.
Any new invitation issued to a user will automatically replace the previous one.
This measure is designed to maintain data accuracy and eliminate user confusion.
**Intuitive Column Management in Real-Time**
We've refined our column management experience to align with your expectations
for immediate, responsive interactions. Key updates include:
Instant Feedback: Adjustments made within the column management panel now
reflect instantly on the table. This immediate update ensures you can
dynamically manage your view without waiting for panel closure.
Enhanced User Experience: This change addresses feedback regarding the previous
UX, where updates only occurred post-panel closure, leading to confusion. Now,
you can see the impact of your selections or deselections in real-time, making
for a more intuitive and satisfying user experience.
These improvements are designed to make your data management processes more
efficient and user-friendly, allowing for seamless adjustments to your viewing
preferences on the fly.
**Introducing `lastActivityAt` - Your Insight into Space Dynamics**
`lastActivityAt` is designed to provide you with comprehensive insights into the
activity within your Spaces. Here's what you need to know:
**What is lastActivityAt?**
An optional datetime attribute that marks the date of the last significant
activity within a Space. Activities considered include, but are not limited to,
adding records to a sheet, uploading files, or altering the configuration of a
workbook.
**Precision and Insight:**
Tracked with second-level precision, `lastActivityAt` offers a valuable overview
of your Space's engagement and operational dynamics, helping you understand user
behavior and space utilization better.
Reach out with feedback as we continue to enhance the utility of this feature to
support your needs better.
[Learn more](https://reference.flatfile.com/api-reference/spaces/get#response.body.data.lastActivityAt)
**Resolved Document Auto-Update Issue**
We fixed an issue where Documents weren't automatically updating in response to
event stream activities (such as document creation, updates, or deletions). This
fix ensures that document changes are now promptly reflected in the UI, keeping
your Space synchronized with all recent activities.
**Enhanced Metadata Management in `@flatfile/plugin-record-hook`**
We've resolved a critical bug in the RecordHook functionality, where `metadata`
wasn't correctly assigned during record updates. By overhauling the workflow for
setting and clearing metadata across various scenarios, we now ensure accurate
and consistent metadata management.
**Introducing Email Theming for Pro Plan Customers**
Users can now update their space metadata object with a theme configuration to
customize the look of their emails. When this feature is flagged on, users can
enjoy custom theming in their email communications.
```js
{
  "theme": {
    "email": {
      "logo": "https://i.imgur.com/xuzxTAU.png",
      "textColor": "#FDEBF7",
      "titleColor": "#FFC0CB",
      "buttonBgColor": "#A7C7E7",
      "backgroundColor": "#34495E",
      "buttonTextColor": "#FFFFFF",
      "footerTextColor": "#F4D1D1"
    }
  }
}
```
Learn more about [Theming](../guides/theming.mdx).
**Enhanced Enum Mapping Logic for Datasets with No Values**
We’ve addressed a nuanced issue concerning enum mapping and rule application in
datasets with no values. Previously, the system saved enum mapping rules with a
logic in place to reuse these rules if every value in a new dataset matched the
rule. This approach inadvertently included scenarios where datasets had no
values, considering them as matching all rules due to the absence of values.
Now, the system filters and applies enum mapping rules based solely on the
values present in the new dataset. This change ensures that rules are applied
more accurately, enhancing the system’s logic in handling datasets, especially
those without any values. This update aims to provide a more reliable and
logical framework for data processing and rule application.
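The new rule-reuse check can be sketched roughly as follows. This is a simplified illustration of the behavior described above, not the platform's actual implementation; the `EnumRule` shape and function names are invented for the example.

```typescript
// Simplified sketch: saved enum mapping rules are reused only when the
// dataset actually contains values and every distinct value is covered.
interface EnumRule {
  sourceValue: string;
  targetValue: string;
}

function shouldReuseRules(datasetValues: string[], rules: EnumRule[]): boolean {
  // Empty datasets no longer vacuously "match" every saved rule.
  if (datasetValues.length === 0) return false;
  const covered = new Set(rules.map((r) => r.sourceValue));
  return datasetValues.every((v) => covered.has(v));
}
```

The key change is the explicit empty-dataset guard: absence of values is no longer treated as matching all rules.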
**Dashboard authentication now lasts 24 hours**
Enjoy longer, uninterrupted sessions when working with your Flatfile Dashboard.
**Enhanced File Upload Experience: Multi-Selection and Drag-and-Drop Support**
We've upgraded the Files View with new capabilities to improve your file
management workflow. Now, you can select and upload multiple files
simultaneously through the File Uploader, streamlining the process.
While the upload progress is displayed for the initial file to maintain clarity,
rest assured that all selected files are being uploaded.
Additionally, the drag-and-drop feature has been enhanced to support multiple
files, making it easier than ever to upload documents directly into the Files
View. This update is designed to enhance productivity and simplify your data
management tasks.
**Optimized Space Loading Series: Reduced Requests for Translations**
We've streamlined the loading process for Spaces by eliminating an unnecessary
request for translations that previously triggered an additional getSpaceById
call. This refinement reduces the load time, ensuring Spaces are displayed more
promptly for a smoother user experience.
**Enhanced Filtering on `GET /spaces` Endpoint with Namespace Parameter
Support**
We've upgraded the `GET /spaces` endpoint to support an empty value for the
`namespace` query parameter. This update allows for more nuanced filtering,
specifically enabling the identification of spaces without a namespace. When the
namespace parameter is omitted entirely, the endpoint now returns spaces across
all namespaces, providing greater flexibility in data retrieval and management.
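The three filtering behaviors can be summarized with a small sketch (illustrative only; the `Space` shape and function are invented for the example, not the endpoint's actual code):

```typescript
// `namespace` mirrors the query parameter:
//   undefined -> parameter omitted: return spaces across all namespaces
//   ""        -> empty value: return only spaces without a namespace
//   "foo"     -> exact match on the namespace
interface Space {
  id: string;
  namespace?: string;
}

function filterSpaces(spaces: Space[], namespace?: string): Space[] {
  if (namespace === undefined) return spaces;
  if (namespace === "") return spaces.filter((s) => !s.namespace);
  return spaces.filter((s) => s.namespace === namespace);
}
```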
**Optimized Space Loading Performance Series: Deferring Blocking Requests**
By strategically eliminating or deferring five key blocking requests, we've
streamlined the data fetching process. Key improvements include the removal of
non-essential data fetches, optimizing the placement of the i18n context for
more efficient rendering, and introducing a new utility for pre-processing date
formats.
These updates collectively result in a smoother, faster user experience,
reinforcing our commitment to efficiency and performance.
**New: Advanced Column Filtering Feature**
Beyond the existing functionalities of sorting, replacing empty values, and
conducting searches within fields, we’ve now integrated a precise value
filtering option for columns. This new feature includes a convenient search
mechanism to effortlessly pinpoint and filter by the specific value you’re
seeking. Streamline your data analysis and management with this robust filtering
capability.
**Enhanced Data Mapping Consistency with User-Created Custom Fields**
In our continuous efforts to streamline data import processes, we've implemented
an improvement targeting the scenario where mapping rules from previous imports
include destination fields no longer present in the current schema. This
situation can arise if a custom field was added during a past import session and
the data being imported is from a completely new sheet, often observed in
embedded imports. With this update, we ensure a smoother data mapping experience
by automatically filtering out rules that do not match the current destination
blueprint, maintaining consistency and accuracy in your data integration
efforts.
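The pruning step amounts to discarding any saved rule whose destination field no longer exists in the current blueprint. A rough sketch (the `MappingRule` shape and names are invented for illustration):

```typescript
// Keep only rules whose destination field still exists in the
// current destination blueprint.
interface MappingRule {
  sourceField: string;
  destinationField: string;
}

function pruneRules(rules: MappingRule[], blueprintFields: string[]): MappingRule[] {
  const known = new Set(blueprintFields);
  return rules.filter((r) => known.has(r.destinationField));
}
```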
**Introducing the Flatfile Dedicated Mapping API**
We’re excited to announce the launch of the Flatfile Dedicated Mapping API, a
significant enhancement to our suite of tools. This new API, encapsulated within
a user-friendly Python library, is designed to fully harness Flatfile’s mapping
capabilities. It provides a robust solution for mapping schemas directly within
your systems, ensuring that your data remains secure in your database. This API
is ideal for a range of applications, from data pipelines and API integrations
to complex data conversion workflows.
The essence of mapping with this API is the transformation of source records
into target records. This process ranges from straightforward tasks like
renaming fields (for example, changing “first\_name” to “firstName”) to more
complex operations such as extracting substrings from fields or concatenating
multiple fields.
Our Mapping API and accompanying Python library bring forth new opportunities
for efficient and precise data mapping, offering customization to meet your
unique needs and workflow requirements.
To get started, refer to the
[README](https://github.com/FlatFilers/flatfile-mapping).
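To make the source-to-target idea concrete, here is a toy sketch of the two transformation styles mentioned above, renaming a field and concatenating fields. The helpers are invented for illustration and are not part of the Mapping API or its Python library:

```typescript
// A record is a flat map of field names to string values.
type Rec = Record<string, string>;

// Rename a field, e.g. "first_name" -> "firstName".
function renameField(record: Rec, from: string, to: string): Rec {
  const { [from]: value, ...rest } = record;
  return value === undefined ? record : { ...rest, [to]: value };
}

// Concatenate several fields into a new target field.
function concatFields(record: Rec, keys: string[], to: string, sep = " "): Rec {
  return { ...record, [to]: keys.map((k) => record[k] ?? "").join(sep) };
}
```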
**Expanding Global Reach with Additional Languages**
We've expanded our language support to include a variety of new
translations. This update builds upon our existing capabilities, which
previously covered French, German, Indonesian, Portuguese, and Spanish.
**What's New?**
* Italian (it)
* Japanese (jp)
* Korean (kr)
* Brazilian Portuguese (pt-BR)
* Turkish (tr)
* Vietnamese (vi)
* Chinese (zh)
Our goal is to continuously add languages that resonate with our diverse user
base. If there's a language you need that we haven't yet included, let us know!
**New Feature Added: Disabling Actions for Empty Records in Workbook &
Sheet-Mounted Actions**
Developers can now add an optional flag to Actions, preventing users from
initiating Actions when Sheets or Workbooks don’t yet have any records. This
helps avoid unnecessary job failures and ensures Actions are performed only once
data is present.
```jsx
actions: [{
  "constraints": [{ "type": "hasData" }]
}]
```
[Learn more](../orchestration/actions) about all the constraints available for
Sheet and Workbook Actions, or
[see the Guide](../guides/actions#workbook-and-sheet-mounted) to see it in use.
**Introducing Composite Uniqueness for Enhanced Data Integrity**
We’re delighted to introduce a new constraint in Sheets aimed at bolstering data
integrity. This constraint guarantees that a designated combination of two or
more fields maintains uniqueness throughout the entire Sheet.
To activate this feature, it’s as simple as adding certain parameters to the
constraints property in your Sheet configuration.
[See a breakdown](../blueprint/constraints#composite-uniqueness) of the
necessary parameters to effectively implement composite uniqueness.
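The guarantee can be pictured with a small validation sketch: a row is a duplicate if the concatenation of the designated fields has been seen before. This is illustrative logic only (the platform enforces the constraint via the Sheet's Blueprint configuration, not client code):

```typescript
type Row = Record<string, string>;

// Return the indices of rows that violate composite uniqueness
// over the given combination of fields.
function findCompositeDuplicates(rows: Row[], fields: string[]): number[] {
  const seen = new Map<string, number>();
  const duplicates: number[] = [];
  rows.forEach((row, i) => {
    // "\u0000" separator avoids false collisions like ("ab","c") vs ("a","bc").
    const key = fields.map((f) => row[f] ?? "").join("\u0000");
    if (seen.has(key)) duplicates.push(i);
    else seen.set(key, i);
  });
  return duplicates;
}
```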
**Powered by Flatfile looks a little different now**

Based on user feedback, we've updated the visual design of the "Powered by
Flatfile" chip to distinguish it from actionable buttons, enhancing the user
interface for clarity.
For those interested in a fully branded experience, we offer the option to
remove this chip entirely through our [Professional
package](https://flatfile.com/pricing/).
**Improvements for `trackChanges` functionality**
For users utilizing `trackChanges` on their Workbooks (which disables your
actions until all commits are complete), we’ve addressed a critical workflow
concern. Changes in a Reference field resulted in the creation of a new commit
event, which left the original commit incomplete. Furthermore, this new commit
wasn’t triggering an event emission, leading to a chain of incomplete commits.
We’ve refined the commit completion process: now, commits will be marked as
complete whenever the user explicitly references a commit in an update request.
This adjustment is handled out of the box with `@flatfile/plugin-record-hook`
and ensures a more streamlined and reliable change tracking experience.
**New: At-a-glance Insights are now available in the Sidebar**

Enhancements to the Spaces sidebar now offer at-a-glance insights for greater
efficiency: each workbook's entry displays a count of contained Sheets,
alongside a tally of records within each Sheet. Additionally, the validity of
records is intuitively indicated with color-coded dots—green for valid and red
for invalid—directly within the Sheet summary. This allows for quick
identification and management of data accuracy.
We've optimized the file deletion process for large files. The next time you
remove a sizable file, expect to experience a noticeably faster removal speed.
**`toolbarBlocking` Jobs Extended to Workbook-Mounted Actions**
We've extended the functionality of the `toolbarBlocking` job mode beyond
individual Sheets. Now, when it's applied to Workbook-mounted actions, this mode
also disables workbook actions (including all of the Sheets actions/toolbars)
during a job's progress. By its nature, it also still allows interaction with
the actual data tables.
This enhancement provides more consistent control throughout your entire
Workbook, ensuring tasks are completed efficiently and without accidental
interference or unintended navigation.
[Learn more](../orchestration/actions#optional-parameters)
**State-Based Messaging on Actions via Tooltips**
We've introduced an exciting new capability to enhance user interaction and
feedback in our application. You can now add custom messages to actions,
tailored according to their state:
* Error
* Info
These messages will be displayed as tooltips when users hover over an action,
providing context-specific text that corresponds to the action's current state.
[Follow the guide](../guides/actions#usage-7)
**Decoupling Actions Updates from the Workbook's Schema**
Previously, using the workbook update endpoint necessitated submitting all the
fields, even for minor adjustments to an action. Now, with our improved `PATCH`
endpoint, you can update actions without needing to include the `fields`
property in your payload.
This refinement simplifies the update process, focusing solely on the changes
you intend to make.
[See the endpoint](https://reference.flatfile.com/api-reference/workbooks/update-a-workbook)
Previously, users experienced difficulties in adjusting column sizes correctly
when other columns were either hidden or pinned. This fix ensures seamless and
accurate resizing of columns, regardless of the visibility or status of adjacent
columns, enhancing the overall usability and functionality of the data table.
**Enhanced Visibility with Status Updates During File Extraction**

A Toast now displays intermediate states to keep you informed every step of the
way during file extraction.
When uploading a file directly to the table, you’ll see statuses like “Waiting
for Extraction”, “Extracting”, and “Unsupported File”, along with detailed
information about the outcome of extraction jobs.
When uploading on the Files page, the error message is toned down to an info
message, as uploading files without an extractor is a legitimate scenario.
These updates are aimed at enhancing transparency and efficiency, ensuring
you’re well-informed throughout the file extraction process.
[See extractor plugins](https://flatfile.com/plugins/category/extractors/)
**Improved Search Functionality for Null Values in GET Records Endpoint**
We've refined the search mechanism for empty cells in the GET records endpoint.
In the past, users would specify `null` in the `searchValue` parameter to locate
empty cells. However, this method posed ambiguity as it didn't clearly
distinguish between an exact search for null values and a partial search for the
string "null".
Now, to accurately search for cells with null values, you should use empty
double quotes `""` in the `searchValue` parameter.
[See the endpoint](https://reference.flatfile.com/api-reference/records/get-records)
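The disambiguated matching behavior can be sketched as follows (an illustrative predicate, not the endpoint's actual implementation):

```typescript
// searchValue ""   -> exact match for null (empty) cells only
// any other string -> partial (substring) match against cell contents
function cellMatches(cell: string | null, searchValue: string): boolean {
  if (searchValue === "") return cell === null;
  return cell !== null && cell.includes(searchValue);
}
```

Under the old behavior, `searchValue=null` was ambiguous between these two branches; the empty-quotes convention removes that ambiguity.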
**Enhanced Theming Tabs, Tooltips, and Popovers**
We're excited to announce a broad range of new theming capabilities that enhance
the visual appeal and customization of our platform:
**Usage**
```jsx
{
"theme": {
"root": {
"borderColor": "pink",
"modalBorderRadius": "10px",
"popoverBorderRadius": "8px",
"tabstripActiveColor": "deepskyblue",
"tabstripInactiveColor": "orange",
"popoverBackgroundColor": "limegreen",
"tabstripHoverTextColor": "gold",
"tooltipBackgroundColor": "hotpink",
"tabstripHoverBorderColor": "purple"
}
}
}
```
1. Theming for Tabs: Customize the look and feel of tabs to better match your
application's design and user experience.
2. Enhanced Tooltip Styling: Gain more control over tooltip aesthetics with
expanded theming options, ensuring they align seamlessly with your interface.
3. Refined Borders Across Components: Apply custom themes to borders, providing
a sharper and more cohesive visual distinction for various elements.
4. Popover Customization: Tailor the appearance of popovers with new theming
capabilities, enhancing their integration within your application's layout.
5. Modal Window Styling: Elevate the design of modal windows with customizable
theming options, contributing to a more engaging and harmonious user
experience.
These updates offer a greater degree of flexibility in customizing the
appearance of key UI components, allowing for a more integrated and visually
consistent application design.
[Learn more](../guides/theming).
**Field Descriptions now have Markdown Support**

We're delighted to announce that field descriptions now support Markdown
formatting. This enhancement enables you to create more informative and engaging
field descriptions, complete with formatted text and hyperlinks.
**Usage**
```jsx
"fields": [
{
"key": "code",
"label": "Product Code",
"description": "This can be **markdown**.",
"type": "string"
},
]
```
* Rich Formatting Capabilities: Leverage the power of Markdown to format your
field descriptions, making them more readable and useful.
* Incorporate Links: Easily include links in your field descriptions, providing
direct access to additional resources or related information.
* Versatile Display Across Interfaces: These enriched descriptions are displayed
as tooltips within the data table and in mapping. Additionally, the full text
of the descriptions is available in the data checklist for comprehensive
insights.
This update is designed to improve the clarity and effectiveness of field
descriptions, enhancing the overall user experience in data interaction and
comprehension.
Learn more about [Blueprint](../blueprint) and Fields.
**Default Value Added to `inputForm` fields**
We've enhanced the `inputForm` feature in Actions by supporting `defaultValue`
settings for enums, strings, and textarea field types.
**Usage**
```jsx
"inputForm": {
  "type": "simple",
  "fields": [
    {
      "key": "planName",
      "label": "Plan Name",
      "type": "string",
      "description": "The name of the plan",
      "defaultValue": "Default plan"
    }
  ]
}
```
This update allows for more intuitive and efficient data entry, as default
values can be pre-filled in these fields, streamlining the user experience.
Learn more about [Actions and Input Forms](../orchestration/actions#inputForm).
**Enhanced Theming Interactive options**

The latest updates in our theming capabilities bring a fresh look and enhanced
customization to various UI elements:
**Usage**
```jsx
{
"theme": {
"root": {
"badgeBorderColor": "green",
"interactiveBorderColor": "orange",
"interactiveBorderRadius": "0px"
},
"table": {
"cell": {
"active": {
"borderWidth": "2px",
"boxShadow": "3px 3px 0px 0px #000000"
}
},
"tooltip": {
"borderRadius": "0px"
},
"lookupEditor": {
"option": {
"borderRadius": {
"hover": "0px",
"active": "8px"
},
"backgroundColor": {
"hover": "yellow",
"active": "lavender"
}
}
}
}
}
}
```
1. Enhanced Dropdown, Text Input, and Context Menu Theming: Users can now apply
customized themes to dropdowns, text inputs, and context menus, offering a
more cohesive and personalized interface design.
2. Refined Active Cell Styling in Tables: The active cell in the table now
features themable border width and box shadow, adding depth and focus to the
selected cell.
3. Upgraded Tooltips in Tables: Tooltips in the table now support theming for
border radius and box shadow, allowing for smoother integration with the
overall design aesthetic.
4. New Optional Border for Badges: An optional border can now be added to badges
within the app. This update extends to TT badges in enum columns, providing a
consistent and visually appealing element across the platform.
These theming enhancements are part of our ongoing commitment to provide a
versatile and visually engaging user experience, allowing for greater
consistency and branding alignment across our platform.
[Learn more](../guides/theming).
**Enhanced Theming options for Checkboxes**

We're pleased to announce the expansion of our theming capabilities with new
customization options for checkboxes:
**Usage**
```jsx
{
"theme": {
"root": {
"checkboxBorderRadius": "0px",
"checkboxBorderColor": "red"
},
"table": {
"inputs": {
"checkbox": {
"borderColor": "magenta"
}
}
}
}
}
```
1. Customizable Checkbox Border Radius: Users now have the flexibility to theme
the border radius of checkboxes, allowing for a more personalized and
visually cohesive interface.
2. Theming for Checkbox Border Color: Alongside border radius customization,
users can also theme the border color of checkboxes, adding another layer of
visual customization to match your application's aesthetic.
3. Override Options in Table View: In table contexts, users have the added
ability to override the default checkbox border color, offering even more
control over the table's appearance and consistency with your overall design
theme.
These enhancements aim to provide greater flexibility and control over the UI,
enabling users to tailor the look and feel of checkboxes to better align with
their unique branding and design preferences.
[Learn more](../guides/theming).
**Enhanced Theming options for Badges+**

We're excited to announce new additions to our theming capabilities, allowing
for even more customization and a refined user experience.
**Usage**
```jsx
"theme": {
"root": {
"pillBorderRadius": "0px",
"badgeBorderRadius":"10em"
},
"table": {
"buttons": {
"pill": {
"color": "white",
"backgroundColor": "magenta"
},
"iconColor": "magenta"
}
}
}
```
**What's New:**
1. Pill and Badge Border Radius Customization: You can now tailor the border
radius of pills and badges within space theming. This update enables you to
define and apply a consistent look across all pills and badges in the space
according to your design preferences.
2. Toolbar Button Color Theming: We've also introduced the option to customize
the color of toolbar buttons, giving you more control over the visual style
of your toolbar.
3. Inherited and Customizable Toolbar Filters: Toolbar filters will
automatically inherit the pill border radius settings for a unified
appearance. However, this can be overridden to suit specific design needs,
offering both convenience and flexibility.
These enhancements are part of our ongoing commitment to providing a versatile
and user-friendly platform that caters to your unique theming requirements.
[Learn more](../guides/theming).
**Revamped UI for Foreground Jobs**

We've given our foreground jobs a fresh, new look:
1. Enhanced Job Modal Header: The job modal header now prominently displays job
information, making it instantly visible and accessible.
2. Refined Display Details: Information that was previously in the header is now
elegantly presented as an ultralight subline for improved readability.
3. Optimized Layout: We've repositioned the display of the percentage and
remaining time for better clarity and focus.
4. Improved Time Estimation: For estimated completion time, we've introduced a
fallback mechanism based on simple linear extrapolation, ensuring more
accurate and reliable predictions.
These updates are designed to offer a more intuitive and streamlined user
experience in tracking and managing foreground jobs.
**Next Action Links in Job Outcome Dialogs Now Support Linking to internal
resources**

In addition to the existing ability to link to external URLs and trigger
downloads, you can now also display links to internal resources upon job
completion. This enhancement broadens the scope of actions you can perform,
offering more versatility in directing users to relevant resources or actions
post-job completion.
In the code below, we create a button that says “See all downloads” with
this path: `/space/us_sp_1234/files?mode=export`
**Usage**
```jsx listener
await api.jobs.complete(jobId, {
outcome: {
message: `The file has been created`,
acknowledge: true,
//Reference: https://platform.flatfile.com/s/space/{$id}/{path}?{$query}
next: {
type: "id",
id: "dev_sp_1234",
path: "files",
query: "mode=export",
label: "See all downloads",
},
},
});
```
This improvement adds versatility and flexibility to your job outcomes,
enhancing user interaction and experience.
[Learn more](../orchestration/jobs#next-links).
**Next Action Links in Job Outcome Dialogs Now Support Retrying**
You can now also display links to retry a job upon job completion or failure.
This enhancement broadens the scope of actions you can perform, offering more
versatility in directing users to relevant resources or actions post-job
completion.
**Usage**
```jsx listener
await api.jobs.complete(jobId, {
outcome: {
message: `The file has been created`,
acknowledge: true,
next: {
type: "retry",
label: "Try again",
},
},
});
```
This improvement adds versatility and flexibility to your job outcomes,
enhancing user interaction and experience.
[Learn more](../orchestration/jobs#next-links).
**Enhanced Theming options for Filters**

We've expanded the theming capabilities for filters, providing a more
customizable and visually appealing interface:
**Usage**
```jsx
"theme": {
"table": {
"filters": {
"outerBorder": "2px red solid",
"innerBorderRadius": "0px",
"outerBorderRadius": "0px"
}
}
}
```
1. New Styling Features for Filters: We've introduced the option to customize
filters with borders and border radius as part of the table theme.
2. Flexible Border Radius Design: To accommodate the varying sizes of elements
and the potential difference in calculated border radius, we now use both
inner and outer border radius settings. This enhancement not only ensures a
cohesive look but also adds an extra layer of styling flexibility to the
filters.
These improvements are designed to offer more control over the aesthetics of
filters, enabling a seamless and integrated visual experience.
[Learn more](../guides/theming).
**Introducing New Endpoint for Workbook Commit Retrieval**
We're excited to announce the addition of the new GET
`/workbooks/{workbookId}/commits` endpoint. This improvement streamlines the
process of retrieving all commits associated with a specific workbook, offering
a more efficient way to determine the completion status of all processing
activities on that workbook.
While this enhancement primarily benefits the Flatfile UI, the endpoint is fully
exposed, making it a valuable tool for developers looking to optimize commit
retrieval and processing status checks in their own applications.
[See the endpoint](https://reference.flatfile.com/api-reference/workbooks/get-commits-for-a-workbook)
We've successfully addressed and fixed the issue related to extracting/uploading
files with special characters in their names. If you previously encountered
difficulties with such files, this update resolves those challenges, ensuring
smoother and more reliable file handling going forward.
**Improved Handling of Long-Running Jobs with New Timeout Feature**

To enhance clarity and efficiency in job processing, we’ve implemented an update
for managing long-running jobs. Previously, jobs that weren’t acknowledged
within 5 minutes would silently fail due to a timeout by the agent, leading to
confusion as these jobs never reached completion (they were left in an
executing state).
To address this, we’ve now added a routine check that occurs every 5 minutes.
This check will automatically fail jobs that are still executing but haven’t
been
[acknowledged](https://reference.flatfile.com/api-reference/jobs/acknowledge-a-job)
(ack’d) within this timeframe. This update ensures better transparency and
control in the job execution process, reducing confusion and streamlining
operations.
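The routine check amounts to a simple predicate over each job's state and age. A minimal sketch, assuming an invented `Job` shape (the 5-minute threshold comes from the description above; everything else is illustrative):

```typescript
interface Job {
  status: "executing" | "complete" | "failed";
  acknowledged: boolean;
  startedAt: number; // epoch milliseconds
}

const TIMEOUT_MS = 5 * 60 * 1000;

// A job is failed by the periodic check if it is still executing,
// was never acknowledged, and has exceeded the timeout window.
function shouldFail(job: Job, now: number): boolean {
  return (
    job.status === "executing" &&
    !job.acknowledged &&
    now - job.startedAt > TIMEOUT_MS
  );
}
```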
**You can now disable Actions when there are invalid records**

Previously, developers faced challenges in determining when all rows were valid,
essential for preventing unintended data egress.
Now, developers can add an optional flag to Actions, preventing users from
initiating Actions when Sheets or Workbooks contain invalid records.
```jsx
actions: [{
...
constraints: [{ type: 'hasAllValid' }]
...
}]
```
Add a `constraints` parameter in your Action that includes `type: 'hasAllValid'`
to disable actions when there are invalid records.
[Learn more](../guides/actions#usage)
We made an enhancement to bulk cell pasting. Previously, locking the cell
occurred prematurely, causing values to flash after pasting. Now, locking is
effectively executed later, resulting in a smoother and more stable pasting
experience.
**Space ID Search is now available in the Spaces Table**

We've added a new feature that allows you to search for spaces using their Space
ID in the Spaces table. This enhancement simplifies the process of locating
specific spaces quickly and efficiently.
**Enhanced Weighting Mechanism for User History in Enum Mapping**
We weight other users' history at 80% relative to the current user's own
history. Previously, this weighting was only applied to field mapping; now, it
has been extended to enum mapping as well.
This update addresses an issue where conflicting histories between two users
could result in one user being unable to edit due to the other's history taking
precedence.
With this enhancement, the system now more effectively balances historical
inputs from different users, ensuring smoother and more equitable enum mapping
automation.
**It's the little things: Bulk delete in Spaces is now available**

We’ve added a new feature that allows for the selection and bulk deletion of
Spaces from your list. This enhancement is particularly beneficial for customers
managing numerous Spaces, as it eliminates the need to delete each Space
individually, streamlining the process and saving time. Enjoy!
**New: Column Visibility Control with Local Storage Memory**

We've introduced a new feature that allows for selective hiding or showing of
columns, enabling users to customize their view for more focused work.
Additionally, your column visibility preferences will now be automatically saved
to local storage, ensuring that your personalized settings are remembered for
future sessions. This update aims to enhance user efficiency by allowing you to
tailor the interface to your specific needs and workflow.
**Guests can now be named in a Spaces URL**
Previously, accessing Flatfile either directly through a URL or via an iframe
with an `accessToken` would categorize all user activities under an anonymous
guest. Now, if you know the `guestId`, interactions in Flatfile can be tied back
to that person.
This is made possible with a new endpoint, **/guests/:guestId/token**, which
generates an accessToken, and can be used to load the Flatfile space at
**[http://platform.flatfile.com/space/:spaceId?token=:accessToken](http://platform.flatfile.com/space/:spaceId?token=:accessToken)**.
With this method, Guests are now accurately named and identifiable, enhancing
user recognition and auditing.
[See the endpoint](https://reference.flatfile.com/api-reference/guests/get-guest-token).
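Once you have an `accessToken` from the `/guests/:guestId/token` endpoint, the Space URL is assembled as shown in the helper below (a minimal sketch; only the URL shape comes from the description above):

```typescript
// Build the Space URL that ties guest activity to a named guest.
function buildSpaceUrl(spaceId: string, accessToken: string): string {
  return `http://platform.flatfile.com/space/${spaceId}?token=${accessToken}`;
}
```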
**Speed Optimization for Large Enum Lists**
When either the source (file) or destination (Blueprint) list contains more than
100 enums, our mapping AI will now utilize a simplified algorithm. This change
addresses scenarios where customers have extensive enum lists, such as 5,000
different Shopify products, which previously slowed down our assignment
algorithm. With this update, the execution speed for such extensive lists will
be faster, ensuring a more efficient and responsive experience for clients with
large datasets.
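The switch between algorithms is a simple size check. A sketch under the stated threshold (the 100-enum cutoff comes from the description above; the function and labels are invented for illustration):

```typescript
// Lists with more than 100 enums on either side use the simplified matcher.
const SIMPLIFIED_THRESHOLD = 100;

function chooseEnumMatcher(
  sourceCount: number,
  destinationCount: number
): "full" | "simplified" {
  return sourceCount > SIMPLIFIED_THRESHOLD || destinationCount > SIMPLIFIED_THRESHOLD
    ? "simplified"
    : "full";
}
```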
**Fixed File Extraction for Workbook-Only Guests**
We’ve fixed an issue where guests with
[workbook-only access](../developer-tools/security/roles-and-permissions#workbook-grant)
could upload files but couldn’t extract them due to access restrictions. Guests
can now upload and extract files smoothly in workbooks they have access to.
**Full Whitelabeling now available in Pro+**

There are a few places where the Flatfile logo appears to your end customers.
1. The **Powered By Flatfile** logo appears in the bottom right of Workbooks.
2. When sending invitations to Users or Guests, the Flatfile logo is included at
the top of the emails.
For those seeking a complete white-labeling experience, we offer a fully
customizable solution as part of our Pro plans and above. Feel free to get in
touch with us at [support@flatfile.com](mailto:support@flatfile.com) to discuss
the possibilities of a fully whitelabeled experience tailored to your needs.
**Hide the info Tooltip**

Within each Space, we've implemented an informative tooltip providing essential
details about the Space. This tooltip serves as valuable support for addressing
customer inquiries and issues. However, we understand that there are scenarios
where you may prefer to hide this information from your customers.
To accommodate your preferences, we've introduced the option to hide this Space
tooltip. To make this adjustment, you can utilize the "Update Environment" patch
available in our
[API Reference](https://reference.flatfile.com/api-reference/environments/update-an-environment).
```jsx
metadata: {
  showSpaceInfo: false
}
```
In this update, we've implemented a crucial enhancement by unifying the casing
in our API. We've resolved issues where certain properties were inconsistently
cased, which had previously hindered their proper setting and retrieval.
With this unification, you can now seamlessly interact with our API, ensuring a
smoother and more reliable experience.
Previously, if an existing value was present in the enum field, you couldn't
select a different value from the dropdown. Additionally, when a dropdown field
was empty, you were unable to use the dropdown to select a valid option, and
typing out an option wasn't possible either. Both of these limitations have now
been resolved.
**Control your mapping accuracy**
We're excited to announce a new parameter within Sheet configuration, aptly
named `mappingConfidenceThreshold`. This parameter empowers you to fine-tune
your experience when working with automatically suggested fields during mapping
jobs.
With `mappingConfidenceThreshold`, you have the flexibility to configure the
minimum required confidence level. Think of it as a precision control, offering
you the choice between conservative (exact match) or liberal (fuzzy matching)
settings.
**How it works**
Set a value greater than 0, up to a maximum of 1, for more precise control over
the mapping process:
```jsx workbook.js
sheets: [
  {
    mappingConfidenceThreshold: 0.6,
    ...
  }
]
```
[Learn more](../blueprint/sheet-options#mappingConfidenceThreshold)
**More Precise Filtering is now Available**
Now, FFQL queries made via `filter:` in the search bar can be seamlessly
combined with other filters. Whether you prefer using tabbed filters like
“valid” and “error,” or you rely on the “Filter By Value” menu, you can now
harness the power of FFQL alongside these filters.
With this update, you’ll experience precise narrowing of row results every time
you apply FFQL queries in conjunction with other filters. No more guesswork—your
filtered results will be spot on, making it easier than ever to find the
information you need with pinpoint accuracy.
**Theming Column Headers**

You can now control the font of the headers in a table using the
`column.header.fontFamily` property.
[Learn more](../guides/theming#theme-table)
**Enhanced Cell Copying with Multi-line Preservation**
We've improved the functionality for copying cells from external spreadsheets
into our system. Now, when you copy cells containing more than one line, these
new lines are preserved in the cell value upon pasting.
While these additional lines may not be visually apparent in the table, you can
verify their presence by inspecting the data through the API, or by copying and
pasting the cell content into another application.
This update ensures better data integrity and consistency when transferring
content with complex formatting.
**Removed Character Limit on Mappings, Jobs, and Enum Mappings**
We have lifted the previous 255-character limit for mappings, jobs, and enum
mappings. This update allows for greater flexibility and more detailed entries
in these areas.
**New API Feature to Restore Archived Spaces**
API users now have the capability to restore an archived space. This can be done
by setting the `archivedAt` property of the space to null via the
`PATCH /spaces/:spaceId` endpoint.
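As a minimal sketch (assuming the standard v1 REST base URL and bearer-token auth), the restore call only needs `archivedAt: null` in the PATCH body; the helper below just builds the request:

```typescript
// Hypothetical helper: builds the PATCH request that restores an archived
// Space by nulling out archivedAt. The base URL and auth header shape are
// assumptions; the key detail from the changelog is the archivedAt: null body.
function buildRestoreRequest(spaceId: string, token: string) {
  return {
    url: `https://platform.flatfile.com/api/v1/spaces/${spaceId}`,
    init: {
      method: "PATCH",
      headers: {
        Authorization: `Bearer ${token}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ archivedAt: null }),
    },
  };
}

// Usage (illustrative):
// const { url, init } = buildRestoreRequest("us_sp_123", secretKey);
// await fetch(url, init);
```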
**Improved Cascade Deletion for Spaces, Workbooks, and Sheets**
We've enhanced the deletion process for Spaces. In the past, deleting a Space
did not automatically cascade to its Workbooks and Sheets. Now, with the latest
update, soft deleting a Space will also soft delete its associated Workbooks and
Sheets.
Furthermore, we've refined the `GET /workbooks` query. It now filters out
workbooks linked to previously deleted Spaces, ensuring that only relevant and
active workbooks are displayed. This update is helpful for Spaces deleted before
this improvement was implemented.
**Enhanced Display of Enum Options in Turntable**
We've addressed an issue in the table where enum options without a label were
incorrectly displaying as 'undefined'. With the latest update, in cases where an
enum option lacks a label, the table will now default to showing the option's
value instead.
This ensures a more accurate and user-friendly display of data.
**Disable Actions while Hooks are Running**

Previously, developers faced challenges in determining when all hooks had
finished running, essential for ensuring data transformation completion and
preventing unintended data egress. As a workaround, they resorted to creating
placeholder fields that defaulted to invalid states. When a hook completed
processing for a record, it marked the field as valid, allowing submission only
when there were no errors in this field.
Now, we're thrilled to introduce a solution to simplify this process—an all-new
event: `commit:completed`. This event signals the end of all processing tasks.
```jsx
settings: [{ trackChanges: true }];
```
Add a `settings` parameter in your sheet that includes `trackChanges: true,` to
disable actions on both Sheets and Workbooks until any pending commits have been
completed.
[Learn more](../blueprint/sheet-options#settings)
**New Job Mode: `toolbarBlocking`**
We're excited to introduce a third job mode, `toolbarBlocking`, alongside the
existing `foreground` and `background` modes. This new mode lets you disable
the Sheet Toolbar and Column Header Menus while still allowing users to enter
records manually.
[Learn more](../orchestration/actions#optional-parameters)
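A sketch of how the new mode might appear in an action definition, using the action shape shown elsewhere in this changelog (the `operation` and `label` values are illustrative):

```typescript
// Illustrative action using the new toolbarBlocking mode; while this job
// runs, the Sheet Toolbar and Column Header Menus are disabled, but manual
// record entry stays available.
const exportAction = {
  operation: "exportData", // placeholder operation name
  mode: "toolbarBlocking",
  label: "Export",
  description: "Export the current records.",
  primary: true,
};
```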
**Improved copy-paste experience while bulk selecting cells**
We've enhanced the bulk selection experience by enabling paste functionality.
When you paste, we simply transition you out of the selection state, similar to
when editing. Note: if you paste without selecting any cells, the operation will
automatically apply to the bulk-selected rows.
In the past, when your Space took more than 10 seconds to build, an "empty space
created" message would appear. With our latest update, as long as you
acknowledge the space:configure job, you can rest assured this dialog will no
longer appear.
**New: Embed Sheets inside Documents**
Now, by simply incorporating the `embed` HTML entity into your markdown body and
providing the sheet ID, workbook ID, and name, you can effortlessly embed Sheets
into Documents. Additionally, you have the flexibility to choose whether the
embedded Sheet starts in an expanded or collapsed state when the document is
loaded.
This enhancement provides further freedom to tailor document-driven interactions
precisely to your needs.
[Learn more](../guides/documents#embedding-sheets-in-documents)
**Actions are now available on Documents.**

Documents, which are standalone webpages within your Spaces, can now host a
variety of actions, just like Workbooks, Sheets, and Files.
Document-Mounted Actions have their own configuration within a Document object.
The executable code within an Action is compiled into a Job entity, offering the
flexibility to run asynchronously or immediately. This empowers you to create
more interactive and dynamic Documents, enhancing the overall user experience
within your Flatfile Spaces.
```jsx workbook.js
//your document can now have an action that looks like this
actions: [
{
operation: 'sendAction',
mode: 'foreground',
label: 'Send to...',
description: 'Send this page to someone.',
primary: true,
},
{...}
]
```
[Learn more](../guides/actions#document-mounted)
**The API now supports two new actions: `sheets.lock` and `sheets.unlock`.**
With the introduction of these API actions, you gain the ability to manage sheet
locking within your application. The `sheets.lock` action allows you to lock a
sheet, preventing any further modifications to its content. Conversely, the
`sheets.unlock` action enables you to release the lock and restore full editing
capabilities to the sheet.
These new API actions provide greater control and flexibility when it comes to
managing sheet access and data integrity.
[See the endpoint](https://reference.flatfile.com/api-reference/sheets/lock-a-sheet)
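Assuming the lock and unlock endpoints follow the path naming in the API reference (`/sheets/:sheetId/lock` and `/sheets/:sheetId/unlock`), a small URL builder might look like:

```typescript
// Hypothetical sketch: build the URL for the sheet lock/unlock actions.
// The base URL and exact paths are assumptions based on the reference naming.
function sheetLockUrl(sheetId: string, action: "lock" | "unlock"): string {
  return `https://platform.flatfile.com/api/v1/sheets/${sheetId}/${action}`;
}

// Usage (illustrative):
// await fetch(sheetLockUrl("us_sh_123", "lock"), { method: "POST" });
```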
**New: Full-screen Documents!**
Documents have gained a new capability, allowing them to seamlessly transition
into full-screen takeovers, thus removing their presence from the sidebar.
**How it works:**
To enable this functionality, we've introduced a new property on Documents:
`treatments`. This property allows users to define specific descriptors for each
Document. When a Document is assigned the "ephemeral" treatment, it will
transform into a full-screen overlay, visible exclusively within the Spaces
dashboard, while being discreetly tucked away from the Spaces sidebar.
This gives you the freedom to tailor document-driven interactions precisely to
your needs.
```ts
await api.documents.create(spaceId, {
treatments: ["ephemeral"],
title: "Getting Started",
});
```
[Learn more](../guides/documents#ephemeral-documents)
**We’ve added Indonesia (ID) translations!**
Building upon our existing language support (based on your customers’ browser
locale), which already included French, German, Portuguese and Spanish
translations, we’ve expanded our capabilities to cater to an even broader
audience.
Need another language? Let us know and we’ll prioritize getting it added.
**Now Available: Set estimated time of completion on Jobs**

When acknowledging a job from a Workbook or Sheet action, you now have the
option to set the `estimatedCompletionAt` parameter. Setting the estimated time
of completion for the job leads to more informative and interactive features in
the UI.
1. In the Foreground Job overlay, you'll see real-time progress displayed as a
percentage, along with an estimate of the remaining time.
2. Additionally, the Jobs Panel will share visibility into the estimated
remaining time for acknowledged jobs.
[Learn more about using Actions](../guides/actions).
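A minimal sketch of an acknowledge payload carrying the estimate (the surrounding `info` and `progress` fields are assumptions based on the usual job-acknowledge shape; only `estimatedCompletionAt` is the feature described above):

```typescript
// Build a job-ack payload that includes an estimated completion time, which
// drives the progress percentage and remaining-time display in the UI.
function buildAckPayload(minutesFromNow: number) {
  return {
    info: "Processing records...", // illustrative status text
    progress: 10,                  // illustrative starting progress
    estimatedCompletionAt: new Date(Date.now() + minutesFromNow * 60_000),
  };
}

// Usage (illustrative): await api.jobs.ack(jobId, buildAckPayload(5));
```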
Previously, when uploading a file, actions could only be added after the file
was uploaded (by calling a file update via a listener). With our latest update,
you can include actions during file upload, allowing them to be applied to the
file immediately upon creation.
See the
[API Reference](https://reference.flatfile.com/api-reference/files/upload-a-file).
The find/replace operation is now able to handle large datasets. Now, records
are batched into groups of 10,000 rows, and a unique versionId is assigned to
each batch. Additionally, the workbook service will emit events for each batch
processed.
This improvement is particularly beneficial for @flatfile/plugin-record-hook,
which can now retrieve and respond to all records in a single find/replace
operation, thanks to the 10,000-row batching.
When adding a secret to a space via the API, we have expanded the flexibility of
secret names. While the API allows spaces within secret names, the UI previously
restricted them by filtering spaces in the regex validation logic for Secret
Names. With this update, space characters are now considered valid and accepted
characters in secret names, aligning the behavior between the API and the UI.
Learn more about [sharing secrets](../guides/secrets).
**Now Available: Accept Dynamic Inputs on Actions**

Now, when initiating an action, you have the option to gather additional
information from end users to facilitate the successful completion of the
intended task.
**Here’s how it works:**
Suppose you want to allow users to specify the name of the file they intend to
export. In such cases, you can configure input fields for your action. When an
end user triggers this action, a secondary dialog will appear, prompting them to
provide the necessary information.
**The available input types include:**
* `string`: For capturing text-based information.
* `textarea`: Ideal for longer text entries or descriptions.
* `number`: To collect numeric data.
* `boolean`: For simple yes/no or true/false responses.
* `enum`: Enables users to select from a predefined list of options.
**Two easy ways to get started:**
* Learn about all the parameters available on `inputForm` for Actions
[here](../orchestration/actions#inputform).
* Or follow this copy/paste [Guide](../guides/actions#actions-with-input-forms)
to quickly see it in action in your code.
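As a hedged sketch, an action collecting an export filename might attach an input form like this (the exact `inputForm` field shape may differ from the docs linked above; the names here are illustrative):

```typescript
// Illustrative action that prompts the user for a file name before running.
const exportAction = {
  operation: "export",
  mode: "foreground",
  label: "Export data",
  inputForm: {
    type: "simple",
    fields: [
      {
        key: "fileName",   // the entered value arrives with the triggered job
        label: "File name",
        type: "string",    // one of the input types listed above
      },
    ],
  },
};
```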
**Exciting Update: Additional Fields**

With additional fields, end users can seamlessly map to fields that don’t exist
by simply adding a new property to any of your new or pre-existing Blueprint(s).
Here’s how it works:
* When you set `allowAdditionalFields` to true, your Sheet gains the ability to
accept additional fields, extending beyond what’s originally specified in its
configuration.
* These extra fields can be incorporated, either through API integrations or by
end users during the file import process.
* Fields that go beyond the Blueprint’s initial setup are marked with a
`treatment` property set to `additional`, ensuring complete transparency over
what extends the Blueprint.
* What’s more, adding a custom field is a breeze—its name will be automatically
set to match the header name from the file being mapped. This simplifies the
process.
[See the docs](../blueprint/sheet-options).
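A minimal sheet sketch with the flag enabled (sheet and field names are illustrative):

```typescript
// Sheet config that accepts fields beyond those declared below; extra columns
// mapped by end users are marked with the "additional" treatment.
const contactsSheet = {
  name: "Contacts",
  slug: "contacts",
  allowAdditionalFields: true,
  fields: [
    { key: "firstName", type: "string", label: "First Name" },
    { key: "email", type: "string", label: "Email" },
  ],
};
```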
**Introducing New and Enhanced Event Logs**

Our aim is to make it easy for everyone to monitor events, find problems, and
help developers avoid overcomplicating their code with excessive Console logs
that would need removal later.
With our enhanced Event logs, you can now:
1. View a comprehensive list of logged events, categorized by event topic.
2. Easily discern the status of each event.
3. Gauge the execution time of each event.
4. Access events chronologically, with the most recent at the top.
5. Dive into each event for context and review any associated console logs.
As an added bonus, you have the capability to filter events, focusing solely on
failures.
See it live: [platform.flatfile.com/logs](http://platform.flatfile.com/logs)
Choosing a `namespace` on Space creation is now easier than ever. Rather than
manually typing the namespaces into a free-form text field, you can simply
choose from a dropdown menu of available options. The available namespaces are
conveniently listed within the `namespaces` array of the Environment, providing
a more efficient and accurate way to handle namespace selection.
Previously, when using the find and replace feature, only the initial batch of
records had data hooks run again. Now, the find/replace operation efficiently
groups records into batches of 10,000 rows, assigning a unique versionId to each
batch. This improvement empowers record-hooks to process all records seamlessly
within a single find/replace operation.
Now, you can include a free-form JSON object as `metadata` on your Workbook,
providing you with a flexible way to store additional information related to it.
Whether you're creating a new Workbook or updating an existing one, you have the
option to enrich your Workbooks with meaningful metadata.
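For instance, a workbook might carry free-form metadata like this (the keys are illustrative; Flatfile does not interpret them):

```typescript
// Workbook config with a free-form metadata object for integration-specific
// bookkeeping that should not be exposed to end users.
const workbookConfig = {
  name: "Q3 Customer Import",
  metadata: {
    sourceSystem: "CRM",
    owner: "data-team",
  },
  sheets: [],
};
```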
Previously, when adding a secret to a space through the API, we supported spaces
in secret names. However, the UI had regex logic that filtered out spaces when
validating secret names.
In this update, we've harmonized the process by accepting space characters as
valid characters for secret names in both the API and the UI. Your secret names
can now include spaces without any issues.
We've resolved an issue where long messages would overflow the cell in
validation messages. By implementing a straightforward `word-break` property,
your validation messages will now display lengthy messages more elegantly.
**We’ve added Portuguese (Brazilian) translations!**
Building upon our existing language support (based on your customers’ browser
locale), which already included French, German, and Spanish translations, we’ve
expanded our capabilities to cater to an even broader audience.
Need another language? Let us know and we’ll prioritize getting it added.
**Enhanced Document Customization with Markdown & HTML Support**
Until now, you could only utilize Markdown for your Documents. However, we're
excited to inform you that HTML support has been seamlessly integrated, offering
you even more versatility to tailor your Documents exactly to your liking.
We've revamped the way partial replaces work, enhancing your data editing
experience for maximum smoothness and intuitiveness. In the past, when you
attempted to replace specific characters with an empty string (""), it
occasionally resulted in the cell value becoming null, which wasn't quite what
you had in mind.
For instance, if you needed to eliminate dashes from UUID fields, you'd
naturally want "abc-123" to transform into "abc123" rather than mysteriously
turning null.
We value your feedback. Now, you can confidently perform partial replacements
without the hassle of unexpected null values.
We've made improvements behind the scenes to ensure that your data gets mapped
with more precision. Previously, some data could be missed during mapping, but
now, we're considering both the option labels and values, making sure nothing is
left behind. This means you'll have a smoother and more reliable experience when
mapping your data, ensuring that everything is captured correctly.
**Localization is Now Available**

We are excited to introduce localization features in Flatfile! Now, your
customers will enjoy automatic translations based on their browser locale,
including French, German, and Spanish.
Key Features:
* Translations are exclusively applied in Spaces (guest areas).
* Our user-friendly guide will assist you in effortlessly translating and
personalizing the content within your space, whether it's custom actions,
documents, and more.
* Require another language? Send us a note, and we'll make it a priority to
include it.
[Follow the guide](../guides/localization) or try the
[Localization demo Space](https://platform.flatfile.com/getting-started) to get
started.
**New: Promote Sheet Actions**

Previously, the `primary:true` concept on Actions didn't affect Sheet actions,
leaving them all in the "more actions" drop-down. But now, setting primary to
true will showcase these actions right in the Sheet toolbar, as buttons with
clear, user-friendly text.
You can now effortlessly view up to 500k records at once. When you reach the end
of the table, a helpful message will appear, reminding you that there's more
data waiting. Simply search or filter to reduce the size of your dataset and
access it all. This change was added due to browser limitations that restrict
maximum height in the DOM.
**Introducing: Code Blocks for Documents**

We've introduced the ability to add code blocks to your Documents using
Markdown fenced code-block syntax (triple backticks), with an optional language
identifier after the opening fence.

Your code will be formatted accordingly, enhancing the clarity and presentation
of your content.
Previously, guests without access to the specific workbook would encounter a
"Workbook Not Found" error. Now, a fallback mechanism has been implemented to
validate access to the `primaryWorkbook`, ensuring a smoother experience for
users.
Introducing support for finding empty fields in ffql using the syntax:
```jsx
filter: first_name eq ""
```
Now, you can easily query and filter records with empty values.
Resolved an issue where attempting to upsert an Environment secret that was
already defined within a Space was not functioning as expected. This fix ensures
proper handling when upserting a secret into an Environment after specifying a
Space ID.
Resolved an issue where the import button could prematurely appear before a file
was ready to be imported. This was due to an early update of the file's status
with a workbook ID, which has now been adjusted to wait until the data is
queriable before updating. The import process now aligns better with the file's
readiness.
Experience improved performance (for instance, when scrolling to the bottom of
the data table) with large workbooks as we've optimized query clauses and
updated indexes. Plus, we've seamlessly migrated all existing workbooks to
benefit from these enhancements.
When a file stream becomes unreadable due to file malformation, improvements
were made to ensure that any errors are correctly communicated to the UI.
Documents in the sidebar are now organized based on their creation date. This
gives developers more control over the order of their Documents.
If you have access to multiple Spaces, you'll notice a dropdown menu at the top
left corner of each Space. Previously, there was an issue where all the data
within the Space would update correctly, except for Workbooks/Sheets in the
sidebar. This issue has been successfully resolved.
2 new Demo spaces were added: Documents & Theming. In these demo Spaces, you'll
learn how to:
1. Add Markdown + HTML Documents to your Space
2. Customize the look and feel of Flatfile to match your brand
We've extended the `job:ready` timeout to 10 minutes, aligning it with the
extractor timeout. This adjustment provides more time for all jobs running
within an Agent.
**New Job Outcome Acknowledgements**
**`acknowledge: false`**

By default, job outcomes are reported through a toast notification in the
top-right corner. To utilize this, simply set `outcome => message`. This
approach ensures that your job completion status is promptly communicated to the
end user.
**`acknowledge: true`**

When the `acknowledge` option is configured as `true`, a persistent full-screen
modal is presented. This modal remains visible until the user interacts by
clicking the "Continue" button, acknowledging the outcome.
**Usage**
```jsx
await api.jobs.complete(jobId, {
outcome: {
acknowledge: true,
message: "Text here.",
},
});
```
This enhancement provides flexibility in how you choose to inform users about
job outcomes. [Learn more](../guides/actions#usage).
**Enhanced Job Outcomes with Next Action Links**
Job outcomes have been upgraded to support `next` action links. Now, you can
display links to external URLs, trigger downloads, or retry jobs upon job
completion or failure.
**Usage**
```jsx listener
await api.jobs.complete(jobId, {
outcome: {
next: Url | Download | Retry,
message: "Text here.",
},
});
```
This improvement adds versatility and flexibility to your job outcomes,
enhancing user interaction and experience.
[Learn more](../orchestration/jobs#next-links).
**Enhanced Action Button Behavior**

Two key enhancements to Actions have been introduced:
**Disable Actions When Invalid Records**: Developers can now add an optional
flag to Actions, preventing users from initiating Actions when Sheets or
Workbooks contain invalid records. This helps avoid unnecessary job failures and
ensures Actions are performed on valid data.
```jsx
constraints: [{ type: 'hasAllValid' }],
```
**Disable Actions When No Selected Records:** To enhance the user experience,
we've introduced the `hasSelection` flag. When added as a constraint, this flag
disables Actions if no records are selected in the Sheet or Workbook, ensuring
Actions are only triggered when relevant data is chosen.
```jsx
constraints: [{ type: 'hasSelection' }],
```
[Learn more](../orchestration/actions#optional-parameters).
**Improved FFQL handling of dates and number comparisons**
**Enhanced Number Field Queries:**
When conducting equality or inequality FFQL comparisons for `number` fields, the
query value is now cast to a number and then compared with the parsed "shadow"
value. This rectifies issues related to numeric comparisons. Additionally, we've
resolved a bug where numbers with no digits before the decimal point (e.g.,
".3") were not being properly parsed into shadow values.
**Advanced Date Field Handling:**
For `date` fields, query values are now attempted to be parsed into dates. On
the SQL side, a `CASE` statement is employed to parse the stored values into
dates for accurate comparison. To accommodate SQL-side date parsing
requirements, we've integrated a regex pattern to detect `YYYY-MM-DD`, `m/d/yy`,
and `m/d/yyyy` formats. This ensures correct parsing and comparison of date
values.
**Fix for Invalid Number Input:**
We've resolved a bug where changing a number field to an invalid number left the
previous "shadow" value intact. Now, in such cases, the previous shadow value is
properly cleared, leading to consistent and accurate behavior.
These updates contribute to improved query handling, better data integrity, and
a more seamless experience when working with number and date fields.
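For example, with the improved handling, queries like these (illustrative field names) now compare correctly against the parsed values:

```jsx
filter: amount eq .3
filter: signup_date gt "2023-01-01"
```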
Introducing caching for record counts, resulting in improved performance for API
requests.
The `PATCH /sheets` endpoint has been upgraded to accept both an array of
SheetUpdates and an array of SheetConfigs (backwards compatible). This expanded
capability empowers users with more flexible and efficient options when updating
Sheets.
We've introduced a new parameter: `tooltip` to Actions, allowing the addition of
tooltips. These tooltips are now visible in the user interface for both buttons
and list items whenever the associated Action is enabled. This enhancement
provides users with clear and context-aware explanations for enabled Actions,
contributing to an improved overall user experience.
Fixed a cosmetic issue with scrolling through Workbooks. The problem stemmed
from the outer container scrolling based on the inner content's height. By
applying overflow properties to the inner component, which holds a list of items
within a flex container, we have resolved this issue.
**🚀 4 New Example Spaces**
Visit the **Getting Started** page on your Dashboard to discover four new
options for effortlessly generating demo spaces:
1. Namespaces
2. Metadata
3. Egress
4. Sidebar Customization
Similar to the rest of the options, we've provided the underlying code for each
Space, simplifying the process of breaking down and comprehending the elements
on display.
**Authentication Query Enhancement**
The authentication query has been streamlined for optimized performance.
Extensive benchmarking has revealed that this refinement contributes to a
reduction of approximately 20 milliseconds for each request. This enhancement
results in faster overall processing and improved response times.
The **Data Checklist** now includes data types that correspond to each field.
Additionally, the searchable dropdowns are now more user friendly.
A pagination logic issue concerning **Environments** has been resolved. The
correction ensures accurate calculation of the number of pages and consistent
delivery of valid responses in accordance with the
[Pagination](https://flatfile.stoplight.io/docs/api/1dcb99aada63f-pagination)
type.
Furthermore, the default page size for **Spaces** has been set to 10, aligning
it with the specifications outlined in the API documentation.
Previously, attempting to navigate into a dropdown using the keyboard's tab key
was unresponsive. This issue has been addressed, and tabbing via keyboard now
smoothly activates dropdowns, accompanied by a focus outline for the custom
trigger.
The API specification has been updated to facilitate the mapping of `enum`
values of various types such as `string`, `integer`, or `boolean`. This
modification effectively resolves a server error response that was previously
encountered when utilizing such `enum` values within the API.
In addition, the loading state of the "Continue" button has been refined to
ensure smooth recovery from server errors. This adjustment enhances the overall
user experience by providing more graceful handling of unexpected issues during
the process.
**🚀 Instant Extraction for CSV/TSV/PSV Files**
With the removal of the extraction step for CSV/TSV/PSV files, the import
experience is now more seamless than ever. As soon as the upload is complete,
these files are instantly extracted, ensuring an efficient and immediate
handling of your data.
The impact of this change is remarkable. What used to take approximately 3 and a
half minutes to extract now concludes in less than 10 seconds.
In addition, we now natively support TSV and PSV files, meaning you don't need
to use an extractor plugin for these file types.
While speed is the prime advantage, this upgrade doesn’t merely boost
performance. It also simplifies and enhances reliability in our system.
Concerns about only a fraction of a file being extracted are now history.
Furthermore, this approach strategically eases the load on our database,
reducing the likelihood of encountering resource limits.
In essence, it’s a win-win for both efficiency and user experience.
**🚀 A Revamped Starting Point**

Navigate to your Dashboard's Getting Started page to find an array of new
options for effortlessly creating demo spaces. Additionally, we've included the
underlying code for each Space, making it straightforward to deconstruct and
understand what you're seeing.
**🚀 Major improvements to our queue system**
We’ve implemented a state-of-the-art technology to substantially enhance the
reliability and performance of our queue system. This improvement has a profound
impact on the execution of asynchronous tasks, like data validation.
Now, you can set a description for each Option Field value via API. End users
can then view this description as a tooltip during mapping.
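A sketch of an enum field whose options carry descriptions (the field and option names are illustrative):

```typescript
// Enum field with per-option descriptions; each description surfaces as a
// tooltip during mapping.
const statusField = {
  key: "status",
  type: "enum",
  label: "Status",
  config: {
    options: [
      { value: "active", label: "Active", description: "Record is in use" },
      { value: "archived", label: "Archived", description: "Record is retired" },
    ],
  },
};
```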
We've added new API routes to capture up to 5 snapshots of a Sheet, with the
flexibility to restore any of them using the API at your convenience. Stay tuned
as we prepare to bring this functionality to the UI as well.
A Sheet with no access enabled (`access:[]`) now shows a lock icon on the Sheet
tab.
We opted to hide the files count on the Files page if there are no files.
A shrimp-sized update to the colors of the sidebar toggle so it looks good with
all themes.
Info inside `metadata` > `userInfo` will now display in the Spaces list.
Metadata allows you to store and retrieve additional data about a Space, Record,
or Field without exposing it to end users. Now, Environments also have a
`metadata` string array.
We made a small fix to allow better vertical scrolling in the Dashboard sidebar.
CTRL + Z wasn't working for a minute. This is now fixed.
If a boolean cell is empty, we only show the toggle on hover now.
We were seeing issues where corrupt files that ran through extraction did not
fail but simply skipped over the lines affected. This is now resolved.
**🚀 Introducing Command+k Triggered Search and Transformation (Early Access)**

This update enhances your workflow by centralizing experiences under the
Command+k shortcut.
A straightforward forward slash, followed by these options, grants access to our
data transformation and query tools:
* `/transform` (AI-powered)
* `/query` (AI-powered)
* `/search` (global search)
* `/filter` (Flatfile Query Language, ffql)
* `/in` (field-specific search)
Chat with us or email [support@flatfile.com](mailto:support@flatfile.com) to
have this feature flagged on in your Account today!
There is now a count at the top of the files list that shows the total number of
files.

The Sidebar now has three different states:
* Collapsed state
* Open state
* Totally hidden state
Additionally, the Sidebar will now automatically collapse itself on smaller
screens.
We now handle overflowing space name(s) by:
* Breaking the word
* Limiting the name to two lines and giving it an ellipsis if it overflows
* Adding a tooltip with the space name to truncated names
* Previously, badge names in the data checklist could break to two lines making
them hard to read. This is also fixed.
When determining the lighter shades of each main theme colors, i.e. primary,
danger, warning and success, there are now checks to ensure no colors end up as
white.
The filename was added to the mapping scene so users can now see which file they
are currently mapping.
Cells with no data can now be unmapped using “Do not import”, if previously
mapped to a value.
`filter: "Last Name" like A%` will now retrieve all records where Last Name
starts with “A”. Previously, it retrieved records that contained “A”.
The continue button is now disabled on the mapping scene while mappings are
loading, ensuring users can only advance to the review scene after mappings are
saved.
# SDKs
Source: https://flatfile.com/docs/changelog/sdks
Notable additions and updates to Flatfile core libraries
As of version 3.9.0, the Flatfile CLI now includes TypeScript types (if available) and package.json in the source map when deploying an agent. This enhancement improves debugging capabilities and provides better type information in the deployed code, making it easier to troubleshoot issues in production environments.
`@flatfile/react@7.12.4`
Fixes a bug when relaunching a re-used space. Resolves an issue causing the
"Maximum update depth exceeded" React error when multiple Portals are used
within the same FlatfileProvider (this is not supported). Introduces changes to
the FlatfileContext and FlatfileProvider components to handle the onClose
callback, using a mutable ref instead of state. Refactors the resetSpace
function and introduces changes to handle the resetOnClose config option when
re-using a space.
`@flatfile/react@7.12.3`
The release notes include a fix for a bug with the Workbook onSubmit
functionality. Specifically, it addresses an issue where the onSubmit action was
not being correctly added or updated in the workbook actions list. The fix
ensures that the onSubmit action is properly added or updated when the onSubmit
prop is provided, and that duplicates are not created if the action already
exists. This change is relevant for developers using the Workbook component and
expecting the onSubmit callback to be triggered when the workbook is submitted.
`@flatfile/javascript@1.4.2`
The JavaScript Portal style sheet is now attached only once, instead of every
time the Portal is opened. This should improve performance and avoid redundant
style sheet insertions.
A new option has been added to allow specifying submit complete options, giving
more control over the behavior when submitting a job. Developers can now set the
'acknowledge' and 'message' properties when completing a job submission.
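The shape below is a hedged sketch of these options; the exact nesting inside the SDK's start options may differ, so verify against the `@flatfile/javascript` docs:

```typescript
// Hypothetical standalone shape for the new submit-complete options.
// The property names come from this changelog; the object shape is assumed.
const submitCompleteOptions = {
  acknowledge: false,
  message: "Custom complete message",
};

console.log(submitCompleteOptions);
```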
This would result in the job being completed with 'acknowledge' set to false and
'message' set to 'Custom complete message'.
`@flatfile/react@7.12.7`
Added option `submitSettings.startingJobMessage` to configure the initial
message displayed in the job progress modal.
`@flatfile/react@7.12.2`
The release includes a new `removeSheet` method to allow removing a specific
sheet from the workbook by its sheet slug.
`@flatfile/react@7.12.1`
This release updates the @flatfile/react package to version 7.12.1. The update
fixes a bug where the space was not completely reset upon closing the Flatfile
Portal. Specifically, the changes ensure that when the Flatfile Portal is
closed, the internal session space is reset, preventing any state from
persisting between sessions.
`@flatfile/javascript@1.4.1`
The startFlatfile function has been exported from the package, allowing
developers to invoke it directly. Additionally, the FlatfileClient type has been
exported from the @flatfile/api package, providing developers with access to
this type when using the @flatfile/javascript package.
`@flatfile/react@7.12.0`
This release includes the following changes:
A new `onClose` event handler has been added to the `useFlatfile` hook, allowing
developers to specify a callback function that is called when the Flatfile modal
is closed.
The missing stylesheet for the legacy `useSpace` and `usePortal` flows has been
restored, ensuring proper styling for these deprecated hooks.
By default, the Flatfile modal now fills the entire screen. Developers may need
to adjust any style overrides to accommodate this change.
Additionally, a bug that prevented server-side configuration of Spaces has been
fixed.
`@flatfile/react@7.11.0`
The CSS variable `--ff-color-text` has been renamed to `--ff-text-color` to be
consistent with other variable names. This change affects the text color used
throughout the Flatfile React components.
`@flatfile/javascript@1.4.0`
This exciting release adds internationalization support to the Flatfile
JavaScript SDK. The SDK now detects the user's browser language or allows
specifying an override language. Translation strings are provided for the
confirmation modal, error modal, and other UI elements. The SDK also includes
functionality to handle missing translation keys and log them to the console.
Additionally, this release updates the modal logic to use functional components
and make the text strings dynamic based on the provided translations or
defaults.
A new util function `initializeIFrameConfirmationModal` is introduced to handle
mounting the confirmation modal and its associated behaviors onto the iFrame
wrapper element. This function takes callbacks for the modal's button click
actions (exit or stay) and text string providers for the modal's title, message,
and button labels.
`@flatfile/react@7.10.0`
The `FlatfileProvider` component now accepts a new `styleSheetOptions` prop
which allows setting the nonce value and position of the injected stylesheet.
This gives more control over how the styles are applied.
The package now contains utility functions `attachStyleSheet` and `styleInject`
to dynamically inject the stylesheet into the DOM at runtime. This replaces the
previous import of the stylesheet directly.
`@flatfile/javascript@1.3.9`
The `createListener` function now takes an additional `onClose` parameter. This
function is used to handle the closing of the Flatfile space, and it removes the
event listener and cleans up the DOM elements associated with the Flatfile
space. The `initializeIFrameConfirmationModal` function has also been updated to
use the new `closeSpaceNow` function, which simplifies the process of closing
the Flatfile space.
`@flatfile/javascript@1.3.8`
A bug fix for issues with the `closeSpace.onClose()` function. This function is
called when the Flatfile space is closed, and the bug fix ensures that it works
correctly.
An improvement to the closing behavior of the Flatfile space. Instead of
removing the iframe element immediately, a new `closeSpaceNow` function is
introduced, which handles removing the necessary elements and event listeners in
a more organized manner.
There is also an update to the way the confirmation modal is displayed when
closing the Flatfile space. Instead of using a separate event listener, the
closing logic is now handled within the existing `closeFlatfileSpace` function,
which listens for the `job:outcome-acknowledged` event and triggers the closing
process accordingly.
`flatfile@3.7.0`
The latest release of the flatfile package includes improvements for handling
file uploads and inferring event topics from arrays. Listeners using the array
syntax `listener.on(['commit:created', 'layer:created'], (event) => {})` will
now correctly infer the event topics.
This release also includes a fix for handling file uploads with the updated
`flatfile.upload` event.
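The array syntax maps one handler onto several topics. A minimal, self-contained illustration of that registration pattern (a toy class, not the actual `@flatfile/listener` implementation):

```typescript
// Toy listener: registers one handler under several topics and
// dispatches by topic, mirroring the array syntax described above.
type Handler = (event: { topic: string }) => void;

class TinyListener {
  private handlers = new Map<string, Handler[]>();

  on(topics: string | string[], handler: Handler): void {
    for (const topic of Array.isArray(topics) ? topics : [topics]) {
      const list = this.handlers.get(topic) ?? [];
      list.push(handler);
      this.handlers.set(topic, list);
    }
  }

  dispatch(topic: string): void {
    for (const handler of this.handlers.get(topic) ?? []) {
      handler({ topic });
    }
  }
}

const seen: string[] = [];
const listener = new TinyListener();
listener.on(["commit:created", "layer:created"], (event) => seen.push(event.topic));
listener.dispatch("commit:created");
listener.dispatch("layer:created");
```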
`@flatfile/react@7.9.9`
This release includes a fix to address instances where the iframe wasn't ready
when the `portal:initialize` call was posted. The fix ensures that the
`portal:initialize` message is sent to the iframe only after it has finished
loading. This resolves an issue where the initialization process could sometimes
fail due to the iframe not being ready. Developers using the `@flatfile/react`
package will benefit from a more reliable initialization process for the
embedded Flatfile experience.
The new `useIsIFrameLoaded` hook listens for the `load` event on the iframe and
updates a state variable accordingly. The `portal:initialize` message is then
sent only when the iframe has finished loading, ensuring a reliable
initialization process.
`@flatfile/javascript@1.3.7`
This release adds the ability to set the default page when preloading the
Flatfile Portal embed. Use `updateDefaultPageInSpace` and `findDefaultPage` to
specify which workbook, sheet, or document page should display initially.
`@flatfile/listener@1.0.5`
The `unmount()` method has been added, allowing developers to cleanly disconnect
the event listener from the event driver. The `EventDriver` class now includes
`unmountEventHandler()` to remove attached event handlers, providing more
control over event listeners.
`@flatfile/javascript@1.3.4`, `@flatfile/javascript@1.3.6`,
`@flatfile/listener@1.0.4`, `@flatfile/react@7.9.3`
The `closeSpace.operation` and `closeSpace.onClose` properties are now optional,
offering more flexibility. The `@flatfile/embedded-utils` dependency is updated
to version 1.2.1, simplifying event listener setup with `handlePostMessage`.
`@flatfile/listener` now has optimized build tooling and updated dependencies,
improving compatibility and performance.
`@flatfile/javascript@1.3.3`, `@flatfile/react@7.9.2`
This release enables clipboard read and write functionality within the embedded
Flatfile iframe component. Also, `@flatfile/embedded-utils` is updated to
version 1.2.0.
`@flatfile/react@7.9.1`
This release improves the submission process. After data upload, the `onSubmit`
action handlers now acknowledge the job automatically, enhancing user feedback.
`@flatfile/listener@1.0.2`, `@flatfile/react@7.9.0`
`@flatfile/listener` now supports removing event listeners with `off()` and
`detach()` methods. The `FlatfileProvider` component sets up a `FlatfileContext`
with configurations for `Space`, `Workbook`, `Document`, and event handlers.
`@flatfile/javascript@1.3.2`, `@flatfile/react@7.8.12`
Updates to dependencies and improvements in performance for both packages.
`@flatfile/embedded-utils` dependency updated to 1.1.14.
`@flatfile/javascript@1.3.1`, `@flatfile/react@7.8.11`
Updated `@flatfile/embedded-utils` to version 1.1.13. Simplified logic for
setting `sidebarConfig` when initializing a new Flatfile import space.
`@flatfile/react@7.8.10`
Fixed a bug in the `usePortal` hook related to the `onSubmit` function, ensuring
reliable unmounting of the iframe.
`@flatfile/javascript@1.3.0`, `@flatfile/react@7.8.9`
Added `initNewSpace` utility to create a new space, workbook, and document in a
single API request. The `environmentId` parameter is now optional.
`@flatfile/react@7.8.8`
Fixed an issue where the embedded Flatfile iframe would not be removed properly
if there was an error during initialization.
`@flatfile/javascript@1.2.6` `@flatfile/react@7.8.7`
Updated the space creation request to only include defined parameters. Improved
reliability and efficiency of data import experience.
`@flatfile/javascript@1.2.5` `@flatfile/react@7.8.6`
`startFlatfile` function now accepts new optional parameters, including
namespace, metadata, labels, translation path, and language override, providing
more customization.
`@flatfile/javascript@1.2.3` `@flatfile/javascript@1.2.4`
`@flatfile/react@7.8.4` `@flatfile/react@7.8.5`
Updated default spaces URL to avoid unnecessary preflight requests. Improved
support for nested entrypoints in `package.json`.
`@flatfile/react@7.8.3`
Fixed bug related to updating the authentication token for the
`@flatfile/listener` package, ensuring secure and authorized communication.
`@flatfile/javascript@1.2.2` `@flatfile/listener@1.0.1` `@flatfile/react@7.8.2`
Added exports and browser fields in `package.json` for better module resolution
and compatibility. Updated dependencies for improved stability.
`@flatfile/javascript@1.1.7` `@flatfile/javascript@1.2.0`
`@flatfile/javascript@1.2.1` `@flatfile/react@7.8.0` `@flatfile/react@7.8.1`
Removed global style overrides from SDKs, improving compatibility. Updated
dependencies and improved type support for better integration experience.
`@flatfile/javascript@1.1.6`
Added functionality to remove the event listener after submit and close actions,
improving memory management.
`@flatfile/javascript@1.1.5` `@flatfile/listener@0.4.2`
The `@flatfile/javascript` package update includes a better-organized
package.json and fixes for type conflicts with peer dependencies. These changes
ensure a smoother development experience and easier integration. The
`@flatfile/listener` package resolves type conflicts with peer dependencies and
improves type compatibility.
`@flatfile/javascript@1.1.4`
The confirmation modal's close action is now independent of user parameters,
ensuring the modal is consistently removed from the DOM when closed.
`@flatfile/javascript@1.1.3` `@flatfile/listener@0.4.1`
The `@flatfile/javascript` package consolidates source code, generates type
declarations, and re-exports modules for convenience. The `@flatfile/listener`
package update enhances exports and adds dotenv dependency for better
environment variable management.
`@flatfile/javascript@1.1.2`
This release fixes an issue when reusing an existing Flatfile space and adds
support for passing an existing space object to `startFlatfile`.
`@flatfile/javascript@1.1.0`
The new release introduces preloading the Flatfile embed iFrame, initializing
without a space ID, and adding additional metadata when creating a space. The
pubnub dependency has been removed.
Updated to version 1.1.1 with a bug fix and dependency updates for improved
reliability.
`@flatfile/react@7.5.4` `@flatfile/react@7.6.0`
The `@flatfile/react` update bundles `@flatfile/api` and removes the pubnub
dependency. The package now listens for window message events directly.
`@flatfile/javascript@0.3.2` `@flatfile/listener@0.4.0` `@flatfile/react@7.5.3`
Updates include better browser compatibility, gzipping request bodies, and using
Rollup for bundling. Dependencies for `@flatfile/react` are also updated for
better performance and reliability.
`@flatfile/react@7.5.2`
The `@flatfile/react` package removes the styled-components dependency,
migrating styles to SCSS for improved performance and maintainability.
`@flatfile/listener@0.3.19`
The `@flatfile/listener` package now compresses the request body using pako
compression by default when calling the `update()` method. This reduces network
bandwidth usage and transfer times.
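The listener compresses with pako; the round trip below uses Node's built-in zlib (also gzip) purely to illustrate the effect on a request body:

```typescript
import { gzipSync, gunzipSync } from "zlib";

// A repetitive JSON body compresses well; the server sees the original
// payload after decompression.
const body = JSON.stringify({ records: new Array(100).fill({ name: "x" }) });
const compressed = gzipSync(Buffer.from(body));
const restored = gunzipSync(compressed).toString("utf8");
```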
`@flatfile/listener@0.3.18`
The `EventCallback` type now expects a Promise to be returned, allowing
developers to use async/await for greater flexibility and control over
asynchronous operations.
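With a Promise-returning `EventCallback` type, handlers can be written with async/await; a sketch with a deliberately simplified event shape (the real FlatfileEvent carries much more):

```typescript
// Simplified event type for illustration only.
type EventCallback = (event: { topic: string }) => Promise<void>;

const handled: string[] = [];

const onCommit: EventCallback = async (event) => {
  // await asynchronous work (API calls, record updates) here
  await Promise.resolve();
  handled.push(event.topic);
};
```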
`@flatfile/javascript@0.3.1` `@flatfile/react@7.5.1`
The UMD build file for the `@flatfile/javascript` package is renamed to
`dist/index.js`. The `@flatfile/react` package update simplifies the import
process by using `dist/index.js` for browser environments.
`@flatfile/javascript@0.3.0` `@flatfile/react@7.5.0`
The `@flatfile/javascript` package includes bundle optimizations and a new
option to keep the space after submitting data. The `@flatfile/react` package
improves accessibility with the close button's `type="button"` attribute and
adds the option to keep the space active after submission.
`@flatfile/javascript@0.2.2` `@flatfile/react@7.4.0`
The `@flatfile/javascript` package introduces a new Simplified React Flow and
updates dependencies. The `@flatfile/react` package adds a `usePortal` hook for
a smoother integration experience.
`@flatfile/react@7.3.1`
The `useSpaceTrigger` hook has been renamed to `initializeFlatfile`, and
`createOrUseSpace` has been renamed to `OpenEmbed` for better clarity.
`@flatfile/javascript@0.2.1`
Re-introduces the `initializeFlatfile` method, fixing issues with the submit
action and simplifying the user experience.
`@flatfile/javascript@0.2.0`
Simplifies the integration flow by allowing developers to provide sheet
configurations directly and introducing new utilities for record-level hooks and
job submissions.
`@flatfile/react@7.3.0`
The new `useSpaceTrigger` hook allows developers to control when the Flatfile
import experience is triggered.
`@flatfile/javascript@0.1.27` `@flatfile/react@7.2.34`
All parameters are now seamlessly passed to the API when creating a Workbook
client-side.
`@flatfile/react@7.2.32` `@flatfile/react@7.2.33`
Removed `styled-components` from peer dependencies and resolved an issue with
the change.
`@flatfile/listener@0.3.17`
Resolved an issue where `axios` and its associated HTTP methods were
unavailable.
`@flatfile/javascript@0.1.25`
* Implemented UMD build type for seamless compatibility with Content Delivery
Networks (CDNs).
* Updated dependencies: `@flatfile/listener@0.3.16`
`@flatfile/listener@0.3.16`
Adds UMD build type for working with CDNs.
`@flatfile/react@7.2.31` `@flatfile/javascript@0.1.24`
* Enhanced performance by updating package versions and removing unused
dependencies
* Updated dependencies: `@flatfile/embedded-utils@1.0.7`
`@flatfile/react@7.2.29-30` `@flatfile/javascript@0.1.22-3`
Implemented the use of `rollup.js` for bundling purposes, enhancing the
project's build process.
`@flatfile/react@7.2.28` `@flatfile/javascript@0.1.21`
* Updated dependency: `@flatfile/api@1.5.31`
`@flatfile/javascript@0.1.20`
Streamlined the project by removing the unnecessary `dotenv` dependency,
resulting in a cleaner and more efficient codebase.
`@flatfile/javascript@0.1.19` `@flatfile/react@7.2.27`
When creating a Space, you may like to receive the `spaceId` that was created to
tie it back to something in your platform. You can now receive this in the
Javascript SDK. This functionality is coming soon for the React SDK.
`@flatfile/javascript@0.1.17` `@flatfile/react@7.2.24`
In this version, we've made two important updates:
**Auto-Added Labels for Embedded Spaces:** Embedded spaces now come with
auto-generated labels (that will be displayed in your Spaces Dashboard) for
easier navigation and organization.
**Listener loading Reordered**: The listener is now created before the Workbook,
allowing you to listen for workbook creation events within your listener for
more streamlined and effective integration in client-side listeners.
`@flatfile/javascript@0.1.13`
You can now use `@flatfile/api` inside a client-side listener without needing a
secret key. The listener will instead use the `accessToken` created from your
`publishableKey`.
`@flatfile/javascript@0.1.11` `@flatfile/react@7.2.21`
In this version, we've made two important updates:
1. We've changed the default setting for `sidebarConfig` to hide the sidebar,
providing a cleaner and more focused workspace.
2. Now, when no `workbook` is set, `autoConfigure` is automatically turned on
for Space creation. This enables you to listen for the `space:configure` job
in a server side listener.
`@flatfile/plugin-record-hook@1.1.2`
In this update, we've resolved a pesky bug that caused the message generated by
`recordHook` to be overwritten when `bulkRecordHook` was also in use.
Our solution? We've introduced caching, ensuring that both messages can coexist
harmoniously.
`@flatfile/react@7.2.20` `@flatfile/javascript@0.1.8`
This version update introduces proper class names on the Error container so you
can more easily control the look and feel of this component.
See all classes that can be overridden in CSS in the
[Reference](/apps/embedding/reference/advanced#iframe-styles).
`@flatfile/react@7.2.18`
This version introduces the following notable updates:
* **Enhanced Close Dialog**: Class names have been incorporated for all elements
within the close dialog, which becomes visible when closing the embedded
Flatfile component. This improvement gives you more control on styling.
* **Utilization of Shared Utilities**: The integration now utilizes the shared
`@flatfile/embedded-utils` module. This change lays the groundwork for
consistent properties to be shared across all Flatfile wrapper SDKs, promoting
standardization and ease of maintenance.
* **Dependency Refinement**: The dependencies on `vite` and
`@flatfile/configure` have been removed. This streamlines the codebase and
reduces unnecessary dependencies, contributing to a more efficient and
lightweight integration.
`@flatfile/react@7.2.17`
An improvement was made to add the `userInfo` param on `metadata` in a more
comprehensible way.
`@flatfile/api@1.5.21`
The response type for the `getRecordsAsCSV` endpoint was incorrectly typed as a
string instead of a file. This fixes downloading a sheet as a CSV via the SDK.
`@flatfile/listener@0.3.15`
This version has four great updates:
* Previously, you had to fake-out TypeScript because the event time was not
exposed (it was a private class variable) for FlatfileEvents. We have now
added `createdAt` to the FlatfileEvent.
* Axios responses 200-399 now no longer throw Errors.
* Created a secrets cache based on `spaceId`.
* Previously, if your listener fetched secrets for multiple environments, those
maps would entirely override each other.
* If your listener fetched `spaceId` overrides first, and then environment --
you would get the space id overrides.
* We now make sure that the fetch URL in the listener is formatted properly
whether or not the user puts a trailing slash in the environment variable.
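Trailing-slash handling like this is typically a small normalization step; an illustrative helper (not the SDK's actual code):

```typescript
// Joins a base URL and a path so the result has exactly one slash
// between them, whether or not the env var had a trailing slash.
function joinUrl(base: string, path: string): string {
  return `${base.replace(/\/+$/, "")}/${path.replace(/^\/+/, "")}`;
}
```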
`@flatfile/listener@0.3.14`
`event.secrets` now defaults to `spaceId`. It will throw if no `environmentId`
is available for secrets retrieval.
`@flatfile/react@7.2.15`
You can now pass in a space body. This allows you to send more than the
already-defined parameters allowed on space creation.
`@flatfile/react@7.2.14`
The close button and iframe now have a proper class name to allow for CSS
overrides.
# CLI
Source: https://flatfile.com/docs/changelog/sdks/core/cli
The Flatfile CLI is a command-line tool that simplifies the integration process with Flatfile by providing developers with commands to manage and configure their integration from their local environment.
loading...
# Listener
Source: https://flatfile.com/docs/changelog/sdks/core/listener
Listener combines event handling capabilities with Flatfile-specific functionality, such as working with Flatfile records and sessions. Simply put, it receives events and it responds to events.
loading...
# Plugins
Source: https://flatfile.com/docs/changelog/sdks/plugins/highlights
Notable additions and updates to Flatfile plugins
A new Data Checklist Document type has been added to the `@flatfile/plugin-space-configure` package, providing an automated way to include a data checklist document in your space.
To create a Data Checklist Document, simply add the `data-checklist` treatment to your document.
```ts
import { configureSpace } from "@flatfile/plugin-space-configure"

export default function (listener) {
  listener.use(
    configureSpace({
      workbooks: [...],
      documents: [
        {
          title: "Data Quality Checklist",
          body: "# Data Quality Requirements\n\nBelow is the current status of your data:",
          treatments: ["data-checklist"]
        }
      ]
    })
  )
}
```
`@flatfile/plugin-automap@0.7.0`
The bundler for the package has been swapped from Parcel to tsup, and the
@flatfile/util-common dependency has been updated to version 1.5.0. The package
now ships an ES module build with different entry points for Node.js and
browser environments, and the build scripts have been updated to use tsup.
Additional scripts for linting and checking have been added. The package.json
file has been updated with the new build output paths and formats, and the Jest
configuration file has been renamed from jest.config.js to jest.config.cjs to
support the ES module build. Developers using this package may need to update
import paths based on the new module entry points.
`@flatfile/plugin-convert-openapi-schema@0.3.2`
This release fixes an async/await bug in the plugin. The plugin now properly
handles asynchronous operations when configuring the space with an OpenAPI
schema. The changes include updating the index.ts file to use the
configureSpace function correctly and removing unnecessary code. The
setup.factory.ts file was refactored to handle undefined data more gracefully
when generating fields.
`@flatfile/plugin-convert-yaml-schema@0.3.2`
This release includes an important bug fix and updates several dependencies.
The async/await bug in the plugin has been resolved, and the dependency on
@flatfile/plugin-convert-json-schema has been updated to version 0.4.2. The
package.json file has been updated with new script commands for running
different types of tests (unit, e2e). The plugin's entry point (index.ts) has
been refactored to simplify the configureSpaceWithYamlSchema function, which
now uses the configureSpace function from the
@flatfile/plugin-space-configure package. The setup.factory.ts file has been
updated to return a Setup object instead of a SetupFactory, and the
generateSetup function now handles schema processing and field generation more
efficiently.
`@flatfile/plugin-convert-sql-ddl@0.2.2`
This release fixes an async/await bug. The configureSpaceWithSqlDDL function
has been refactored to use the configureSpace function from
@flatfile/plugin-space-configure, simplifying the implementation. The
generateSetup function has been updated to remove the source property from the
workbook object and return a Setup type instead of a SetupFactory type.
`@flatfile/plugin-convert-json-schema@0.4.2`
This release fixes an async/await bug in the plugin. Additionally, the
generateSetup function has been refactored to handle cases where data or
data.properties is undefined, and the partialSheetConfig.source property is no
longer deleted.
`@flatfile/plugin-automap@0.6.0`
This release removes the remeda dependency and adds the modern-async
dependency. It also introduces a new option, `disableFileNameUpdate`, which
disables updating the file name with automatic mapping information. The file
name update previously occurred unconditionally but can now be controlled via
this option. There are no other changes to the external interface.
`@flatfile/plugin-delimiter-extractor@2.2.3`
This release corrects the DelimiterOption import.
`@flatfile/plugin-xlsx-extractor@3.1.5`
This release fixes an issue in header row logic that erred when the header row
contained non-string values and the `raw` or `rawNumbers` params were set to
true.
`@flatfile/plugin-automap@0.4.0`
Updated the listener from job:created to job:updated to account for the mapping
plan now being run asynchronously. Updated verifyConfidentMatching from `every`
to `some` to enable more concise auto-mapping.
`@flatfile/plugin-xlsx-extractor@3.1.2`
`@flatfile/plugin-delimiter-extractor@2.1.1`
`@flatfile/plugin-xml-extractor@0.6.1` `@flatfile/plugin-json-extractor@0.8.1`
Extractor plugins now escape keys with special characters.
`@flatfile/util-response-rejection@1.3.4`
An adjustment has been made to ensure proper order when response rejection is
used with a Record Hook.
`@flatfile/plugin-xlsx-extractor@3.1.0`
`@flatfile/plugin-delimiter-extractor@2.1.0`
`@flatfile/plugin-xml-extractor@0.6.0` `@flatfile/plugin-json-extractor@0.8.0`
A small change has been made in the extractors to ensure the normalization of
extracted column keys.
`@flatfile/plugin-xlsx-extractor@3.0.0`
A major change has been made where the 'required' field check from extractors
and the subsequent `field.constraints.required` has been removed. This is a
breaking change if you were using this required field check in your
implementation. Source sheets no longer require the
`field.constraints.required`. If you need to implement this functionality, you
can use a listener to add the `field.constraints.required` to the specific
fields you need.
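One way to reinstate the check is to add the constraint yourself when building the sheet config. A sketch under assumed Blueprint-like shapes (the `Field`/`Constraint` types here are minimal stand-ins; verify the real shapes against current Flatfile docs):

```typescript
// Minimal field/constraint shapes for illustration only.
type Constraint = { type: string };
type Field = { key: string; type: string; constraints?: Constraint[] };

// Returns a copy of the fields with a `required` constraint added to
// the named keys, leaving other fields untouched.
function requireFields(fields: Field[], keys: string[]): Field[] {
  return fields.map((field) =>
    keys.includes(field.key)
      ? {
          ...field,
          constraints: [...(field.constraints ?? []), { type: "required" }],
        }
      : field
  );
}

const fields: Field[] = [
  { key: "email", type: "string" },
  { key: "notes", type: "string" },
];
const updated = requireFields(fields, ["email"]);
```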
`@flatfile/plugin-delimiter-extractor@1.0.0`
`@flatfile/plugin-xml-extractor@0.5.17` `@flatfile/plugin-xlsx-extractor@2.0.0`
`@flatfile/plugin-json-extractor@0.7.6`
Extractor packages now include new features to support header selection.
Developers can enable header selection, allowing the plugin to automatically
detect and return letter headers (e.g. A, B, C) representing column indexes
instead of using the row data as headers. The release also includes dependency
updates and adds a new metadata property to the parser output containing
information about detected row headers when header selection is enabled.
This powerful new header selection capability streamlines working with
delimiter-separated data files lacking explicit column headers.
`@flatfile/plugin-job-handler@0.5.4`
In this release, the @flatfile/plugin-job-handler package introduces improved
error handling when the callback function returns an invalid job outcome. If
the callback function throws an Error, the job is now automatically marked as
failed, with the error message included in the job outcome. This ensures
developers receive clear feedback when issues arise during job processing,
making it easier to debug and address problems.
For example, if the callback function encounters an error like:
```typescript
throw new Error("Failed to process job data");
```
The package will automatically mark the job as failed with the provided error
message:
```json
{
"info": "Failed to process job data",
"outcome": {
"acknowledge": true,
"message": "Failed to process job data"
}
}
```
This release streamlines error handling, providing developers with a more
seamless and reliable experience when working with the
@flatfile/plugin-job-handler package.
The latest release of the @flatfile/plugin-dedupe package introduces an improved
error message when the dedupe plugin is incorrectly called from a workbook-level
action instead of a sheet-level action. This helpful error message provides a
clearer indication to developers using the package, guiding them to correctly
configure the dedupe plugin as a sheet-level action.
`@flatfile/plugin-constraints@2.0.0` `@flatfile/plugin-record-hook@1.6.0`
`@flatfile/plugin-autocast@1.0.0`
The latest release of the core packages adds support for the new string-list and
enum-list field types, allowing developers to work with these data types
seamlessly within their applications. With this release, developers can now
define constraints that validate string-list and enum-list field types,
providing greater flexibility and control over data validation processes. This
new capability streamlines the development workflow and ensures data integrity
across various use cases.
`@flatfile/plugin-job-handler@0.5.3`
In this release of the @flatfile/plugin-job-handler package, the starting
progress of the jobHandler function has been changed from 10% to 1%. This
adjustment provides a more accurate representation of the initial progress state
when using the jobHandler function. Developers can now expect the progress to
start at 1% instead of 10%, allowing for a smoother and more intuitive user
experience when handling jobs with this package.
`@flatfile/plugin-export-workbook@0.3.0`
This release adds a new optional parameter `autoDownload` to the
`@flatfile/plugin-export-workbook` package. When set to true, this parameter
will automatically download the exported Excel file after it has been uploaded
to Flatfile.
For example:
```js
listener.use(exportWorkbookPlugin({ autoDownload: true }));
```
This gives developers more flexibility in how they handle the exported file.
`@flatfile/plugin-constraints@1.2.1`
In this release, the @flatfile/plugin-constraints package has been updated to
version 1.2.1. This update brings improved bundling for plugins and ensures
compatibility with the latest dependencies. Developers can now enjoy a more
seamless integration experience when using this powerful constraints plugin with
their Flatfile applications. Additionally, the release addresses a crucial fix
related to bundling plugins, ensuring smooth operation and enhancing the overall
developer experience. With these improvements, the @flatfile/plugin-constraints
package becomes even more robust and reliable, empowering developers to
effortlessly extend their blueprints with external constraints.
Some key changes include:
* The bundling process for plugins has been optimized, resolving any previous
issues and ensuring a smooth integration experience.
* Dependencies have been updated, including the @flatfile/plugin-record-hook
package, which is now at version 1.5.2, ensuring compatibility with the latest
releases.
To take advantage of these enhancements, developers simply need to update to the
latest version of the @flatfile/plugin-constraints package. For example:
```
npm install @flatfile/plugin-constraints@1.2.1
```
Unlock the full potential of your Flatfile applications with this improved and
streamlined constraints plugin!
`@flatfile/plugin-space-configure@0.5.0`
In this release of the @flatfile/plugin-space-configure package, we've added a
feature that allows you to maintain the order of workbooks in the Space UI
sidebar. This means you can now control the order in which your workbooks
appear, providing a more intuitive and customized experience for your users. To
enable this functionality, simply include the `maintainWorkbookOrder` option set
to `true` when configuring your space. The workbook order will then be preserved
based on the order in which the workbooks are created. This enhancement gives
you greater control over the Space UI, allowing you to tailor it to your
specific needs and provide a seamless user experience.
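A sketch of enabling this option, assuming it sits alongside `workbooks` in the `configureSpace` settings (confirm the exact placement in the plugin docs):

```ts
import { configureSpace } from "@flatfile/plugin-space-configure"

export default function (listener) {
  listener.use(
    configureSpace({
      workbooks: [/* created in the order they should appear in the sidebar */],
      maintainWorkbookOrder: true,
    })
  )
}
```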
`@flatfile/plugin-graphql-schema@1.1.1`
This release enhances the @flatfile/plugin-graphql-schema package with a new
feature. The generateSheets function now allows filtering of GraphQL objects by
providing Sheet slugs. This improvement enables developers to have more granular
control over the mapping between GraphQL objects and Flatfile sheets, improving
flexibility and customization capabilities. For example, developers can now
specify which GraphQL objects should be included or excluded from the sheet
generation process based on their slugs:
```typescript
const sheetConfig = sheetConfigArray?.find(
(config) => config.slug === object.name,
);
if (sheetConfigArray?.length > 0 && !sheetConfig) return;
```
With this update, the @flatfile/plugin-graphql-schema package becomes even more
powerful, offering developers a streamlined experience when working with GraphQL
schemas and Flatfile blueprints.
`@flatfile/plugin-record-hook@1.5.0`
RecordHooks now include readonly configurations. This new feature gives you more
control over how data is presented and modified in your application. Setting a
record as readonly prevents any changes from being made, while marking
individual fields as readonly allows editing for the remaining fields. To
utilize this functionality, simply call the new setReadOnly() method on a record
instance, optionally passing in field keys to specify which fields should be
readonly.
For example:
```js
record.setReadOnly("age", "name");
```
This would make the 'age' and 'name' fields readonly for that record. This
release enhances flexibility and strengthens data integrity, enabling you to
build even more robust applications with Flatfile's powerful data management
tools.
`@flatfile/plugin-xlsx-extractor@1.11.6`
This release enhances the xlsx extractor plugin with improved debug messaging
and a new option for configuring header detection. When running in debug mode,
developers will now see helpful log messages indicating which sheet is being
processed and what headers were detected. Additionally, a new
`headerDetectionOptions` option has been added to the `ExcelExtractorOptions`
interface, allowing consumers to customize the algorithm used for detecting
headers in their Excel files. This gives users more control over how headers are
identified, improving compatibility with different spreadsheet formats. Overall,
these changes provide better visibility into the extraction process and more
flexibility for handling diverse Excel data sources.
**🚀 GraphQL to Flatfile Blueprint is now available with
@flatfile/plugin-graphql-schema@1.0.0 🚀**
`@flatfile/plugin-graphql-schema` provides a robust solution for leveraging
GraphQL data structures within Flatfile, enhancing data management capabilities
and adaptability.
* Dual Functionality: The newly released GraphQL to Flatfile Blueprint plugin
offers versatile integration options for your Flatfile workspace. Use it to
either generate a Flatfile Blueprint directly from GraphQL or configure a
Flatfile Space using GraphQL, streamlining your data integration processes.
* Flexible Input Sources: The plugin supports multiple methods for supplying
GraphQL data:
* API Endpoint: Connect directly to a GraphQL API to fetch data.
* Schema File: Upload a GraphQL schema file.
* Custom Callback: Implement a custom callback function for advanced data
handling and customization.
See the [docs](https://www.npmjs.com/package/@flatfile/plugin-graphql-schema).
`@flatfile/util-common@1.1.1`
The common utils now include a new `deleteRecords` utility. This streamlines
the creation of the delete-records job, a common use case that is better suited
to bulk deletion of records.
`@flatfile/plugin-dedupe@1.0.0`
The deduplication capability of @flatfile/plugin-dedupe has been expanded beyond
the initial 10,000 records limit. Now, it effectively dedupes the entire sheet,
ensuring comprehensive data integrity across larger datasets. Additionally, with
the latest update, @flatfile/plugin-dedupe can create bulk record deletion jobs,
streamlining the process of removing duplicates and optimizing performance.
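Conceptually, the keep-first/keep-last behavior works like this (a minimal sketch over plain objects; the real plugin operates on Flatfile records and issues bulk deletion jobs):

```typescript
type Row = { id: string; email: string };

// Return the ids of duplicate rows to delete, keyed on a field,
// keeping either the first or last occurrence of each value.
function findDuplicates(rows: Row[], keep: "first" | "last"): string[] {
  const seen = new Map<string, string>(); // field value -> id of the kept row
  const toDelete: string[] = [];
  const ordered = keep === "last" ? [...rows].reverse() : rows;
  for (const row of ordered) {
    if (seen.has(row.email)) {
      toDelete.push(row.id);
    } else {
      seen.set(row.email, row.id);
    }
  }
  return toDelete;
}
```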
`@flatfile/util-common@1.0.1`
This release replaces the `api.records.get` on @flatfile/api with a fetch to the
GET records endpoint. This significantly improves response times for large
record sets.
`@flatfile/xlsx-extractor@1.11.2`
This update addresses an issue with how columns were managed when encountering
blank header columns. Previously, the last column was incorrectly removed in
such cases. Now, the column with the blank header is precisely identified and
removed, ensuring more accurate data representation and integrity in your
workflows.
`@flatfile/plugin-foreign-db-extractor@0.0.2`
* Improved Error Handling: This release enhances the error management
capabilities of the plugin, ensuring more robust and informative error
responses during data extraction processes.
* Accurate File Status Updates: Upon successful job completion, the file status
is now correctly updated, providing clear visibility and tracking of job
outcomes.
* Enhanced Database Availability Checks: Recognizing the potential delays in
database readiness, the plugin now includes polling mechanisms for database
users. This ensures that operations proceed only when the restored database is
fully available, enhancing the reliability of data interactions.
`@flatfile/plugin-record-hook@1.4.5`
This release brings improvements and optimizations to enhance performance and
functionality:
**Smart Change Detection:** The plugin now intelligently compares original and
modified values to identify actual changes. If there’s no change (a no-op
situation), it avoids unnecessary record updates, enhancing efficiency and
better supporting trackChanges.
**UMD Build Availability:** A UMD build is now included, expanding compatibility
across different module systems and environments.
**Slimmer Node.js Build:** The Node.js build has been optimized to reduce its
size, improving load times and resource usage.
**Removal of Axios Dependency:** We’ve eliminated the Axios dependency in favor
of native solutions, streamlining the library and reducing external
dependencies.
**Custom Concurrency Control:** The plugin now implements its own concurrency
control mechanism, moving away from reliance on external libraries. This bespoke
solution is tailored to the specific needs of record hook operations, enhancing
stability and performance.
These updates mark a step forward in optimizing @flatfile/plugin-record-hook for
developers and ensuring seamless integration into your data processing
workflows.
[Learn more in the docs](https://flatfile.com/plugins/plugin-record-hook).
`@flatfile/plugin-export-workbook@0.1.6`
This update introduces several enhancements to improve your data export
workflows:
**Sheet-Specific Exports:** Users now have the flexibility to export individual
sheets within a workbook, providing greater control over data management and
distribution.
**Customizable Job Names:** To further personalize your workflow, this version
allows you to specify custom job names, offering an alternative to relying on
the default naming convention.
**Enhanced Character Checks:** We’ve implemented additional checks for
characters that might not be recognized by Excel, reducing the likelihood of
errors after exporting the workbook.
These improvements are designed to make your data export process more efficient
and tailored to your specific needs.
[Learn more](https://flatfile.com/plugins/plugin-export-workbook/) about
exporting workbooks with @flatfile/plugin-export-workbook
`@flatfile/plugin-webhook-egress@1.2.3`
The latest version introduces key improvements for enhanced functionality and
user experience:
**Switch to Fetch:** In this version, we’ve transitioned from using axios to
fetch for HTTP requests, streamlining the library’s usage across different
environments.
**Browser Compatibility:** The plugin is now fully compatible with browser
environments, extending its utility beyond server-side applications.
**Enhanced Response Rejection Handling:** The integration of our response
rejection utility allows for sophisticated post-webhook action decisions.
Following a webhook egress, the utility analyzes the response to determine
whether to delete records that were successfully submitted or to mark them with
a status column reflecting their outcome. It also allows specifying custom error
messages for fields within records that encountered rejections.
This update aims to offer more flexibility, reliability, and broader
applicability of the webhook egress plugin, ensuring seamless integration into
your data workflow.
[Learn more in the docs](https://flatfile.com/plugins/plugin-webhook-egress/)
`@flatfile/plugin-autocast@0.7.6`
0.7.6 introduces string casting capabilities to the autocast plugin. With this
new feature, numbers and booleans can now be automatically converted to strings,
making it easier to manage data types across your applications.
This enhancement is particularly useful in conjunction with the rawNumbers
property found in `@flatfile/plugin-xlsx-extractor`, where numeric values
extracted from documents can be seamlessly transformed into string fields for
consistency and ease of use.
[Learn more](https://flatfile.com/plugins/plugin-autocast)
`@flatfile/plugin-delimiter-extractor@0.9.1` &
`@flatfile/plugin-xlsx-extractor@1.11.3`
New header logic has been added to the @flatfile/plugin-delimiter-extractor and
@flatfile/plugin-xlsx-extractor. This update brings our extractors in line with
Flatfile’s core header detection capabilities, offering three distinct header
detection strategies: `default`, `explicitHeaders`, and `specificRows`.
By employing the default strategy, the extractors now automatically scan the
first 10 rows of your file to identify the header row based on the highest count
of non-empty cells.
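The `default` strategy can be pictured as follows (an illustrative sketch, not the extractors' actual implementation):

```typescript
// Pick the header row: scan the first `scanDepth` rows and choose
// the row with the highest count of non-empty cells.
function detectHeaderRow(rows: string[][], scanDepth = 10): number {
  let best = 0;
  let bestCount = -1;
  for (let i = 0; i < Math.min(scanDepth, rows.length); i++) {
    const count = rows[i].filter((cell) => cell.trim() !== "").length;
    if (count > bestCount) {
      best = i;
      bestCount = count;
    }
  }
  return best;
}
```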
[Discover more](https://flatfile.com/plugins/plugin-xlsx-extractor) about these
enhanced header detection options and how they can streamline your data import
processes.
`@flatfile/plugin-xlsx-extractor@1.11.3`
This release brings essential updates and improvements to enhance your
experience with Excel file extractions:
**File Size Detection:** The plugin now actively detects when a file is too
large for extraction, guiding users to opt for CSV uploads for better handling.
**Date Format Specification:** Introducing the dateNF parameter, enabling users
to define specific date formats for Excel file parsing (e.g., yyyy-mm-dd),
ensuring that dates are correctly recognized and formatted.
**Handling Empty Sheets:** Previously, empty sheets within Excel files would
result in errors. With this update, the plugin gracefully handles such cases by
appropriately failing the job without causing unexpected errors.
These improvements are aimed at providing a more robust, user-friendly
experience for working with Excel files, ensuring data integrity and easing the
extraction process.
[Learn more about the improvements](https://flatfile.com/plugins/plugin-xlsx-extractor)
**🚀 Introducing @flatfile/plugin-convert-yaml-schema 🚀**
`@flatfile/plugin-convert-yaml-schema` automates the process of converting YAML
into the Flatfile Blueprint.
To get started, simply provide a sourceUrl of your schema, and wrap it with the
plugin:
```jsx
listener.use(
configureSpaceWithYamlSchema([{ sourceUrl: "https://example.com/yaml" }]),
);
```
Additionally, you can enhance the experience with a few optional parameters:
* `options.workbookConfig`: The options.workbookConfig parameter allows you to
incorporate other optional Workbook configurations seamlessly.
* `options.debug`: Toggle the options.debug parameter on/off to access helpful
debugging messages for development purposes.
* `callback`: The callback parameter, which receives three arguments—event,
workbookIds, and a tick function—empowers you with the ability to update the
Job’s progress. Note: This callback function is invoked once the Space and
Workbooks are fully configured.
With the `@flatfile/plugin-convert-yaml-schema`, you can simplify the schema
conversion process—all in one powerful package.
Check out the [docs](https://flatfile.com/plugins/plugin-convert-yaml-schema).
**🚀 Introducing @flatfile/plugin-convert-json-schema 🚀**
`@flatfile/plugin-convert-json-schema` automates the process of converting JSON
Schema into the Flatfile Blueprint.
To get started, simply provide a sourceUrl of your schema, and wrap it with the
plugin:
```jsx
listener.use(
configureSpaceWithJsonSchema([
{ sourceUrl: "https://example.com/customer.schema.json" },
]),
);
```
Additionally, you can enhance the experience with a few optional parameters:
* `options.workbookConfig`: The options.workbookConfig parameter allows you to
incorporate other optional Workbook configurations seamlessly.
* `options.debug`: Toggle the options.debug parameter on/off to access helpful
debugging messages for development purposes.
* `callback`: The callback parameter, which receives three arguments—event,
workbookIds, and a tick function—empowers you with the ability to update the
Job’s progress. Note: This callback function is invoked once the Space and
Workbooks are fully configured.
With the `@flatfile/plugin-convert-json-schema`, you can simplify the schema
conversion process—all in one powerful package.
Check out the [docs](https://flatfile.com/plugins/plugin-convert-json-schema).
`@flatfile/plugin-json-extractor@0.1.1`
The JSON extractor now intelligently flattens nested objects, combining their
field keys into a single, structured format until it reaches a primitive type,
making data extraction more comprehensive and user-friendly.
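The flattening behavior can be sketched like this (the dot-joined key format is an illustrative assumption, not necessarily the plugin's exact scheme):

```typescript
// Recursively flatten nested objects, joining keys at each level
// until a primitive value is reached.
function flatten(
  obj: Record<string, unknown>,
  prefix = "",
): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(obj)) {
    const path = prefix ? `${prefix}.${key}` : key;
    if (value !== null && typeof value === "object" && !Array.isArray(value)) {
      Object.assign(out, flatten(value as Record<string, unknown>, path));
    } else {
      out[path] = value; // primitive reached: emit the combined key
    }
  }
  return out;
}
```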
`@flatfile/plugin-record-hook@1.1.11`
In a recent update, we introduced a comparison mechanism to track changes
between original and modified values, ensuring that records were only updated
when both value and messages had changed. However, this approach unintentionally
excluded updates to metadata when no changes were detected in value or messages.
In this version, we've refined the process by comparing the entire original
record object with the modified record object. This ensures that no updates,
including metadata changes, are left unaccounted for, regardless of whether
there were changes in the value or message fields. Your records are now more
comprehensively managed and updated.
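The whole-record comparison can be pictured with a small deep-equality sketch (illustrative; the plugin's actual comparison is more robust):

```typescript
// Compare entire record objects (value, messages, metadata, ...)
// so that any change, including metadata-only changes, is detected.
function recordChanged(original: unknown, modified: unknown): boolean {
  return !deepEqual(original, modified);
}

function deepEqual(a: unknown, b: unknown): boolean {
  if (a === b) return true;
  if (
    typeof a !== "object" ||
    typeof b !== "object" ||
    a === null ||
    b === null
  ) {
    return false;
  }
  const keysA = Object.keys(a as object);
  const keysB = Object.keys(b as object);
  if (keysA.length !== keysB.length) return false;
  return keysA.every((k) =>
    deepEqual(
      (a as Record<string, unknown>)[k],
      (b as Record<string, unknown>)[k],
    ),
  );
}
```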
`@flatfile/plugin-record-hook@1.1.10`
Previously, there was an issue with completing commits while using the
`@flatfile/api` client-side, which led to the error message "Cannot read
properties of undefined (reading 'get')." To address this problem, we've made a
swift and effective transition to using fetch. Soon, we plan a more
comprehensive solution.
`@flatfile/plugin-record-hook@1.1.9`
* Previously, an issue arose when attempting to complete a commit with the
`trackChanges` flag (a Workbook setting to disable actions on Sheet and
Workbook when there is a commit that has not been completed) disabled on the
Workbook. This issue has been resolved, and commits are now exclusively
triggered when the `trackChanges` feature is enabled for the Workbook.
* Additionally, we've implemented an advanced deep comparison method between the
original record and the potentially modified record. This enhancement
guarantees that only essential patches for updated records are transmitted,
resulting in a notably more efficient and precise data update process.
`@flatfile/plugin-export-workbook@0.0.8`
When exporting a workbook, a job completion message is now available with an
added feature. You can now include a "Next" URL within the job completion
message, providing a seamless transition to the next step in your workflow.
`@flatfile/plugin-export-workbook@0.0.7`
In this update, we've made several valuable improvements to enhance your
experience:
**Export Record ID Option**: You can now utilize an optional flag to export the
Record ID, adding flexibility and precision to your exports.
**Column Pattern Generator Fix**: We've addressed and fixed issues related to
the column pattern generator, ensuring accurate and reliable data generation.
**Column Count per Sheet**: We're now passing the column count to the generator
for each sheet, optimizing data handling for your specific needs.
**Improved Error Handling**: We've refined error handling to ensure that jobs
are properly marked as failed in case of errors, providing better transparency
and control.
**File Cleanup**: To keep your Space clean and efficient, files are now
automatically deleted after they have been successfully uploaded, promoting tidy
data management.
These updates collectively contribute to a smoother and more efficient workflow
for your tasks.
`@flatfile/plugin-extractor-xlsx@1.8.0`
In this version, we've added a valuable enhancement to the Excel Extractor by
exposing SheetJS's "raw" option. This option empowers users to preserve
formatted text by default. When `raw` is set to true, the extractor will return
the raw value, offering greater flexibility and control over extracted data.
`@flatfile/plugin-automap@0.1.1`
Previously, initiating the "import" process, whether by dropping a file into a
sheet or clicking the "Import" button from a file, created an unintended
secondary `workbook:map` job. This secondary job was detected by the automap
plugin, leading to duplicate entries in a sheet.
In this version, we've introduced a solution to address this issue. We've added
an "isAutomap" flag to the job's input, which allows the automap plugin to
filter jobs accordingly. If the "isAutomap" flag is not provided, the automap
plugin will gracefully exit, ensuring a more streamlined and error-free
workflow.
`@flatfile/plugin-automap@0.1.0`
Previously, the Automap plugin exclusively matched the `defaultTargetSheet` with
the `sheet.name` or `sheet.id`. In this update, we have expanded the matching
capability to include the `sheet.slug`. This enhancement provides greater
flexibility and precision when configuring default target sheets within the
Automap plugin. This improvement is especially beneficial when you need to pass
something like the sheet name + filename dynamically, making the mapping process
even more versatile.
`@flatfile/util-extractor@0.4.5`
`util-extractor` is a crucial dependency used in all extractor plugins. We've
made an important fix to the import method across all extractor plugins. If you
previously encountered the following error message, you can resolve it by
updating to the latest extractor version.
```sh
Module not found: Error: Can't resolve '@flatfile/api/api'
```
`@flatfile/plugin-space-configure@0.1.4`
An improvement was made to allow for configuring Spaces with no Workbooks.
**Major (Agent) Plugin-based Extraction Speed Improvements Just Went Live**
We noticed the record insert portion of extraction taking far longer in plugins
than in the core platform (with CSVs). We had to get to the bottom of why there
was such a disparity and we’re thrilled to say we have a solution. **We’re
seeing a 700k file that once took between 10-12 mins now only takes 1-1.5 min!**
But wait, there’s more:
Additionally, end users will now get updated percentages during the upload
process and will receive a success message when the file is successfully
extracted.
Upgrade your extractor(s) to the latest to enjoy this optimization:
* `@flatfile/plugin-xlsx-extractor@1.7.5`
* `@flatfile/plugin-delimiter-extractor@0.7.3`
* `@flatfile/plugin-json-extractor@0.6.4`
* `@flatfile/plugin-pdf-extractor@0.0.5`
* `@flatfile/plugin-xml-extractor@0.5.4`
* `@flatfile/plugin-zip-extractor@0.3.7`
`@flatfile/plugin-psv-extractor@1.6.0` & `@flatfile/plugin-tsv-extractor@1.5.0`
We're excited to announce that PSV and TSV file types, previously reliant on
plugins, are now natively supported by the Flatfile Platform! 🚀
As part of this enhancement, we've marked these plugins as deprecated.
Developers will receive a friendly console log notification, making it clear
that these plugins are no longer needed. Enjoy the streamlined experience!
🚀 **Introducing `@flatfile/util-response-rejection`**

Meet `@flatfile/util-response-rejection`, a new utility for showcasing rejected
Records *from an external API* to your customers. Managing rejected data during
egress is vital for maintaining data accuracy, and this utility simplifies the
entire process, ensuring a smoother experience for handling these instances.
Here's what it does:
1. Takes a `RejectionResponse` containing rejected Records and a rejection
message.
2. Locates the corresponding Record and adds the rejection message as an error
to the Record cell.
You can also utilize this utility directly with any listener.
Learn more in the [docs](https://flatfile.com/plugins/util-response-rejection/).
`@flatfile/plugin-extractor-___`
All extractors have been fine-tuned to seamlessly handle extractions within a
job, allowing the plugin more time to complete the extraction with less risk of
the Agent timing out.
Additionally, we've resolved a bug that was causing extractions to falsely
indicate completion when running in parallel, ensuring extraction truly finishes
before signaling completion.
`@flatfile/plugin-autocast@0.2.2`
Dates that were cast to a UTC string using the autocast plugin were showing as
invalid after transformation. A fix for this was added to version 0.2.2.
[Learn more](https://flatfile.com/plugins/plugin-autocast).
`@flatfile/plugin-autocast`
In the most recent update, we’ve introduced some exciting enhancements. You can
now implement an optional `fieldFilter` to specify which fields autocast should
operate on.
Check it out:
```jsx
listener.use(autocast({ sheetSlug: "bar" }, ["numberField", "dateField"]));
```
[Learn more](https://flatfile.com/plugins/plugin-autocast).
🚀 **Introducing `@flatfile/plugin-autocast`**
Effortlessly transform data in your Sheets to align with the field types
specified in the Blueprint.
Supported field types:
* **Numbers:** String numbers ('1'), string decimals ('1.1'), and string numbers
with commas ('1,000') are interpreted as numbers.
* **Booleans:**
* Truthy values: '1', 'yes', 'true', 'on', ' ', 'y', and 1.
* Falsy values: '- ', '0', 'no', 'false', 'off', ' ', 'n', 0, -1.
* **Dates:** Date strings and numbers are cast to a UTC string (note:
`YYYY-MM-DD...` is interpreted as an ISO 8601 date and treated as UTC, while
other formats are treated as local time and converted to UTC).
* '2023-08-16' => 'Wed, 16 Aug 2023 00:00:00 GMT'
* '08-16-2023' => 'Wed, 16 Aug 2023 00:00:00 GMT'
* '08/16/2023' => 'Wed, 16 Aug 2023 00:00:00 GMT'
* 'Aug 16, 2023' => 'Wed, 16 Aug 2023 00:00:00 GMT'
* 'August 16, 2023' => 'Wed, 16 Aug 2023 00:00:00 GMT'
* '2023-08-16T00:00:00.000Z' => 'Wed, 16 Aug 2023 00:00:00 GMT'
* 1692144000000 => 'Wed, 16 Aug 2023 00:00:00 GMT'
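These casting rules can be approximated as follows (a simplified sketch; the plugin's handling of locale-specific dates and edge cases is more involved):

```typescript
// Cast string numbers, including comma-separated ones, to numbers.
function castNumber(value: string): number | undefined {
  const normalized = value.replace(/,/g, "");
  const n = Number(normalized);
  return normalized.trim() !== "" && !Number.isNaN(n) ? n : undefined;
}

// Cast ISO 8601 date strings and epoch milliseconds to a UTC string.
function castDate(value: string | number): string {
  return new Date(value).toUTCString();
}
```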
Note: **@flatfile/plugin-record-hook** listens for the same event type
(commit:created). Plugins will fire in the order they are placed in the
listener.
Check out the [docs](https://flatfile.com/plugins/plugin-autocast).
🚀 **Introducing `@flatfile/plugin-jobs-handler`**
Our latest plugin, `@flatfile/plugin-jobs-handler`, streamlines handling
Flatfile Jobs, which are large units of work performed asynchronously on a
resource such as a file, Workbook, or Sheet.
**Options at your fingertips:**
* Update Job progress using `await tick(progress, message)`, returning a promise
for `JobResponse`.
* `opts.debug`: Enable debug logging for the plugin.
To get started simplifying the management of your Jobs, explore the
[README](https://flatfile.com/plugins/plugin-job-handler/).
🚀 **Introducing `@flatfile/plugin-space-configure`**
Streamline the dynamic setup of new Flatfile Spaces with
`@flatfile/plugin-space-configure`.
**How it works:**
* The `setup` parameter holds the Blueprint for the new Space.
* And the `callback` parameter (invoked once the Space and Workbooks are fully
configured) receives three arguments:
1. `event`
2. `workbookIds`
3. `tick`, a function (powered by `@flatfile/plugin-jobs-handler` under the
hood) that can be used to update the Job’s progress.
To simplify auto-configuring your Spaces, explore the
[README](https://flatfile.com/plugins/plugin-space-configure/).
**All extractors now support `chunkSize` and `parallel`**
A new version of an underlying utility (`@flatfile/util-extractor`) introduces
two new options for extracting records in all extractor plugins:
* `chunkSize`: (Default: 3,000) Define how many records you want to process in
each batch. This allows you to balance efficiency and resource utilization
based on your specific use case.
* `parallel`: (Default: 1) Choose whether the records should be processed in
parallel. This enables you to optimize the execution time when dealing with
large datasets.
*Note: Previously, we were extracting with a chunkSize of 1,000.*
**Example Excel usage (see
[docs](https://flatfile.com/plugins/plugin-xlsx-extractor)):**
```ts
listener.use(ExcelExtractor({ chunkSize: 300, parallel: 2 }));
```
If you update your extractor plugin to the latest, you will receive these new
options.
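The batching behavior of `chunkSize` can be pictured with a simple helper (illustrative only; the utility's internals may differ):

```typescript
// Split records into batches of `chunkSize` for processing.
function chunk<T>(records: T[], chunkSize: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < records.length; i += chunkSize) {
    batches.push(records.slice(i, i + chunkSize));
  }
  return batches;
}
```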
`@flatfile/plugin-delimiter-extractor`
Now that the platform includes native support for TSV and PSV files, developers
are no longer required to use a plugin specifically for these formats. As a
result of this enhancement, the documentation for the
`@flatfile/plugin-delimiter-extractor` has been revised to reflect this update.
For users who are already utilizing or have integrated a plugin for TSV and PSV
files, there's no need to worry about any disruptions. While the extraction will
occur twice, resulting in an "extraction complete" status being displayed twice,
the process remains functional and intact.
**🚀 Introducing `@flatfile/plugin-pdf-extractor`**
Our latest plugin, @flatfile/plugin-pdf-extractor, introduces the power of
parsing .pdf files in Flatfile.
Note: A subscription to pdftables.com is required.
Options at your fingertips:
* `opt.apiKey`: Feed in your pdftables.com API key to unlock the magic.
* `opt.debug`: Toggle debugging messages to streamline development.
Tech Behind the Scenes:
* Empowered by `remeda` for dynamic functional programming and data handling.
* Seamlessly integrates Pattern Matching with TypeScript through `ts-pattern`.
[See the docs](https://flatfile.com/plugins/plugin-pdf-extractor)
`@flatfile/plugin-dedupe@0.0.2`
Includes fixes to properly acknowledge the job on failure and to use a new
listener instance after a filter operation. It also adds server errors to
logging.
`@flatfile/util-file-buffer@0.0.3`
A fix was made to only run `fileBuffer` on uploaded files. This fixes an issue
where extraction was occurring during export improperly.
All extractor plugins went up one tick to leverage this update to the file
buffer.
The extractor most affected was `xlsx-extractor`, as there’s a corresponding
plugin for exporting to xlsx.
**🚀 Introducing `@flatfile/plugin-dedupe`**
@flatfile/plugin-dedupe adds a touch of magic by seamlessly removing duplicate
records right within a sheet with several options to fit your use case:
* `opt.keep`: Decide whether to hang on to the first or last duplicate record.
* `opt.custom`: Craft your own dedupe function, for those out-of-the-box
scenarios.
* `opt.debug`: Toggle on those helpful debug messages when you’re in the lab.
Tech:
* Powered by `ts-pattern` for in-depth Pattern Matching in TypeScript.
* Leverages the mighty `remeda` for JavaScript’s functional programming and data
wizardry.
[See the docs](https://flatfile.com/plugins/plugin-dedupe)
**🚀 Introducing `@flatfile/plugin-delimiter-extractor`**
Introducing the latest addition to our extractor
plugins: @flatfile/plugin-delimiter-extractor. Designed to streamline your data
extraction tasks, this plugin is tailored to process delimited files including
tab (\t), pipe (|), semicolon (;), colon (:), tilde (~), caret (^), and hash
(#).
Parameters and Options:
* `fileExt`: Specifies the file name or extension to listen for, allowing you
to define the file types to process.
* `options.delimiter`: The delimiter character used in the file.
* `options.dynamicTyping`: Automatically converts numeric and boolean data in
the file to their appropriate data types, ensuring accurate processing.
* `options.skipEmptyLines`: With ‘true’, completely empty lines (evaluating to
an empty string) will be skipped during parsing. With ‘greedy’, lines with
only whitespace characters are also skipped.
* `options.transform`: A function applied to each parsed value before
dynamicTyping.
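The two `skipEmptyLines` modes described above can be sketched as:

```typescript
// 'true' drops lines that are exactly the empty string; 'greedy'
// also drops lines containing only whitespace.
function filterLines(lines: string[], mode: true | "greedy"): string[] {
  return lines.filter((line) =>
    mode === "greedy" ? line.trim() !== "" : line !== "",
  );
}
```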
[See the docs](https://flatfile.com/plugins/plugin-delimiter-extractor)
`@flatfile/plugin-xlsx-extractor@1.4.0`
Includes a fix for ghost rows in Excel files (rows that appeared when a cell
had formatting but no data).
`@flatfile/plugin-delimiter-extractor@0.4.0` &
`@flatfile/plugin-xlsx-extractor@1.5.0`
Adds ability to support duplicate headers with non-unique header keys.
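One way to picture deriving unique keys from duplicate headers (the numeric-suffix scheme here is an assumption for illustration, not necessarily the plugins' exact scheme):

```typescript
// Append a numeric suffix to repeated header names so every
// column gets a unique key.
function uniqueHeaders(headers: string[]): string[] {
  const counts = new Map<string, number>();
  return headers.map((h) => {
    const n = counts.get(h) ?? 0;
    counts.set(h, n + 1);
    return n === 0 ? h : `${h}_${n}`;
  });
}
```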
`@flatfile/plugin-delimiter-extractor@0.3.0` &
`@flatfile/plugin-xlsx-extractor@1.3.2`
Adds header row auto-detection (the same function used for CSVs in platform)
`@flatfile/plugin-zip-extractor@0.3.2`
Includes a fix to exclude the unwanted macOS metadata directory (MACOSX); the
plugin now checks the file name rather than the full path.
# @flatfile/angular
Source: https://flatfile.com/docs/changelog/sdks/wrappers/angular
An SDK wrapper that enables you to seamlessly integrate a secure and user-friendly data import experience into your client-side application
# @flatfile/javascript
Source: https://flatfile.com/docs/changelog/sdks/wrappers/javascript
An SDK wrapper that enables you to seamlessly integrate a secure and user-friendly data import experience into your client-side application
# @flatfile/react
Source: https://flatfile.com/docs/changelog/sdks/wrappers/react
An SDK wrapper that enables you to seamlessly integrate a secure and user-friendly data import experience into your client-side application
# @flatfile/vue
Source: https://flatfile.com/docs/changelog/sdks/wrappers/vue
An SDK wrapper that enables you to seamlessly integrate a secure and user-friendly data import experience into your client-side application
# Polyglot API
Source: https://flatfile.com/docs/documentation/api
Choose your language
Our [API SDK](https://reference.flatfile.com/overview/welcome) is available in
multiple languages, making it easy to integrate Flatfile regardless of your tech
stack. Each library gives you the full power of our platform in patterns you'll
feel right at home with.
A JavaScript client for Flatfile.
A Go client for Flatfile.
A Python client for Flatfile.
A Java client for Flatfile.
Coming Soon.
### Setup
After installing the package, import the `FlatfileClient` and instantiate it by
passing in your secret key as the token.
```ts
import { FlatfileClient } from "@flatfile/api";
const token = "sk_your_secret_key";
const api = new FlatfileClient({ token });
```
Make sure to keep your secret key safe, as it can be used to access
potentially sensitive information.
### Making API Calls
Make API calls by calling the appropriate method on the `api` object. Note
that responses are promises, so you'll need to handle them accordingly.
## Core API Methods
```ts
await api.actions.create({
spaceId: "us_sp_your_space_id",
workbookId: "us_wb_your_workbook_id",
actionConfig: { ... },
})
```
```ts
await api.actions.get({
actionId: "us_ac_your_action_id"
})
```
```ts
await api.actions.list({
spaceId: "us_sp_your_space_id",
workbookId: "us_wb_your_workbook_id",
filter: { ... },
page: 1,
pageSize: 10
})
```
\< API Reference />
```ts
await api.actions.cancel({
actionId: "us_ac_your_action_id"
})
```
\< API Reference />
```ts
await api.actions.retry({
actionId: "us_ac_your_action_id"
})
```
\< API Reference />
```ts
await api.agents.create({
environmentId: "us_env_your_environment_id",
config: { ... }
})
```
\< API Reference />
```ts
await api.agents.get({
agentId: "us_ag_your_agent_id"
})
```
\< API Reference />
```ts
await api.agents.update({
agentId: "us_ag_your_agent_id",
config: { ... }
})
```
\< API Reference />
```ts
await api.agents.delete({
agentId: "us_ag_your_agent_id"
})
```
\< API Reference />
```ts
await api.agents.list({
environmentId: "us_env_your_environment_id",
filter: { ... }
})
```
\< API Reference />
```ts
await api.agents.deploy({
agentId: "us_ag_your_agent_id"
})
```
\< API Reference />
```ts
await api.agents.logs({
agentId: "us_ag_your_agent_id",
filter: { ... }
})
```
\< API Reference />
```ts
await api.documents.create({
spaceId: "us_sp_your_space_id",
content: { ... }
})
```
\< API Reference />
```ts
await api.documents.get({
documentId: "us_doc_your_document_id"
})
```
\< API Reference />
```ts
await api.documents.update({
documentId: "us_doc_your_document_id",
content: { ... }
})
```
\< API Reference />
```ts
await api.documents.delete({
documentId: "us_doc_your_document_id"
})
```
\< API Reference />
```ts
await api.documents.list({
spaceId: "us_sp_your_space_id",
filter: { ... }
})
```
\< API Reference />
```ts
await api.documents.share({
documentId: "us_doc_your_document_id",
recipients: ["user@example.com"]
})
```
\< API Reference />
```ts
await api.environments.create({
name: "Production",
config: { ... }
})
```
\< API Reference />
```ts
await api.environments.get({
environmentId: "us_env_your_environment_id"
})
```
\< API Reference />
```ts
await api.environments.update({
environmentId: "us_env_your_environment_id",
config: { ... }
})
```
\< API Reference />
```ts
await api.environments.delete({
environmentId: "us_env_your_environment_id"
})
```
\< API Reference />
```ts
await api.environments.list()
```
\< API Reference />
```ts
await api.environments.getSecrets({
environmentId: "us_env_your_environment_id"
})
```
\< API Reference />
```ts
await api.environments.updateSecrets({
environmentId: "us_env_your_environment_id",
secrets: { ... }
})
```
\< API Reference />
```ts
await api.files.upload({
data: fileData,
config: { ... }
})
```
\< API Reference />
```ts
await api.files.download({
fileId: "us_fl_your_file_id"
})
```
\< API Reference />
```ts
await api.files.get({
fileId: "us_fl_your_file_id"
})
```
\< API Reference />
```ts
await api.files.delete({
fileId: "us_fl_your_file_id"
})
```
\< API Reference />
```ts
await api.files.list({
filter: { ... }
})
```
\< API Reference />
```ts
await api.files.validate({
fileId: "us_fl_your_file_id"
})
```
\< API Reference />
```ts
await api.jobs.create({
config: { ... }
})
```
\< API Reference />
```ts
await api.jobs.get({
jobId: "us_job_your_job_id"
})
```
\< API Reference />
```ts
await api.jobs.update({
jobId: "us_job_your_job_id",
status: "completed"
})
```
\< API Reference />
```ts
await api.jobs.cancel({
jobId: "us_job_your_job_id"
})
```
\< API Reference />
```ts
await api.jobs.list({
filter: { ... }
})
```
\< API Reference />
```ts
await api.jobs.logs({
jobId: "us_job_your_job_id"
})
```
\< API Reference />
```ts
await api.records.create({
sheetId: "us_sh_your_sheet_id",
records: [ ... ]
})
```
\< API Reference />
```ts
await api.records.get({
recordId: "us_rec_your_record_id"
})
```
\< API Reference />
```ts
await api.records.update({
recordId: "us_rec_your_record_id",
data: { ... }
})
```
\< API Reference />
```ts
await api.records.delete({
recordId: "us_rec_your_record_id"
})
```
\< API Reference />
```ts
await api.records.list({
sheetId: "us_sh_your_sheet_id",
filter: { ... }
})
```
\< API Reference />
```ts
await api.records.search({
sheetId: "us_sh_your_sheet_id",
query: "search term"
})
```
\< API Reference />
```ts
await api.records.validate({
records: [ ... ]
})
```
\< API Reference />
```ts
await api.records.bulk.create({
sheetId: "us_sh_your_sheet_id",
records: [ ... ]
})
```
\< API Reference />
```ts
await api.records.bulk.update({
sheetId: "us_sh_your_sheet_id",
records: [ ... ]
})
```
\< API Reference />
```ts
await api.records.bulk.delete({
sheetId: "us_sh_your_sheet_id",
recordIds: ["us_rec_id1", "us_rec_id2"]
})
```
\< API Reference />
```ts
await api.sheets.create({
workbookId: "us_wb_your_workbook_id",
config: { ... }
})
```
\< API Reference />
```ts
await api.sheets.get({
sheetId: "us_sh_your_sheet_id"
})
```
\< API Reference />
```ts
await api.sheets.update({
sheetId: "us_sh_your_sheet_id",
config: { ... }
})
```
\< API Reference />
```ts
await api.sheets.delete({
sheetId: "us_sh_your_sheet_id"
})
```
\< API Reference />
```ts
await api.sheets.list({
workbookId: "us_wb_your_workbook_id"
})
```
\< API Reference />
```ts
await api.sheets.configure({
sheetId: "us_sh_your_sheet_id",
config: { ... }
})
```
\< API Reference />
```ts
await api.sheets.getSchema({
sheetId: "us_sh_your_sheet_id"
})
```
\< API Reference />
```ts
await api.sheets.updateSchema({
sheetId: "us_sh_your_sheet_id",
schema: { ... }
})
```
\< API Reference />
```ts
await api.sheets.commit({
sheetId: "us_sh_your_sheet_id"
})
```
\< API Reference />
```ts
await api.spaces.create({
config: { ... }
})
```
\< API Reference />
```ts
await api.spaces.get({
spaceId: "us_sp_your_space_id"
})
```
\< API Reference />
```ts
await api.spaces.update({
spaceId: "us_sp_your_space_id",
config: { ... }
})
```
\< API Reference />
```ts
await api.spaces.delete({
spaceId: "us_sp_your_space_id"
})
```
\< API Reference />
```ts
await api.spaces.list()
```
\< API Reference />
```ts
await api.spaces.getMetadata({
spaceId: "us_sp_your_space_id"
})
```
\< API Reference />
```ts
await api.spaces.updateMetadata({
spaceId: "us_sp_your_space_id",
metadata: { ... }
})
```
\< API Reference />
```ts
await api.spaces.getMembers({
spaceId: "us_sp_your_space_id"
})
```
\< API Reference />
```ts
await api.spaces.addMember({
spaceId: "us_sp_your_space_id",
userId: "us_usr_your_user_id",
role: "admin"
})
```
\< API Reference />
```ts
await api.spaces.removeMember({
spaceId: "us_sp_your_space_id",
userId: "us_usr_your_user_id"
})
```
\< API Reference />
```ts
await api.users.get({
userId: "us_usr_your_user_id"
})
```
\< API Reference />
```ts
await api.users.update({
userId: "us_usr_your_user_id",
data: { ... }
})
```
\< API Reference />
```ts
await api.users.delete({
userId: "us_usr_your_user_id"
})
```
\< API Reference />
```ts
await api.users.list({
filter: { ... }
})
```
\< API Reference />
```ts
await api.users.invite({
email: "user@example.com",
role: "member"
})
```
\< API Reference />
```ts
await api.users.getApiKeys({
userId: "us_usr_your_user_id"
})
```
\< API Reference />
```ts
await api.users.createApiKey({
userId: "us_usr_your_user_id",
config: { ... }
})
```
\< API Reference />
```ts
await api.users.revokeApiKey({
keyId: "us_key_your_key_id"
})
```
\< API Reference />
```ts
await api.workbooks.create({
spaceId: "us_sp_your_space_id",
config: { ... }
})
```
\< API Reference />
```ts
await api.workbooks.get({
workbookId: "us_wb_your_workbook_id"
})
```
\< API Reference />
```ts
await api.workbooks.update({
workbookId: "us_wb_your_workbook_id",
config: { ... }
})
```
\< API Reference />
```ts
await api.workbooks.delete({
workbookId: "us_wb_your_workbook_id"
})
```
\< API Reference />
```ts
await api.workbooks.list({
spaceId: "us_sp_your_space_id"
})
```
\< API Reference />
```ts
await api.workbooks.configure({
workbookId: "us_wb_your_workbook_id",
config: { ... }
})
```
\< API Reference />
```ts
await api.workbooks.getSchema({
workbookId: "us_wb_your_workbook_id"
})
```
\< API Reference />
```ts
await api.workbooks.updateSchema({
workbookId: "us_wb_your_workbook_id",
schema: { ... }
})
```
\< API Reference />
```ts
await api.workbooks.commit({
workbookId: "us_wb_your_workbook_id"
})
```
\< API Reference />
# Personal Access Tokens
Source: https://flatfile.com/docs/documentation/authentication/account-token
Create and manage user-scoped tokens for API authentication
## Overview
Personal Access Tokens (PATs) provide a secure way to authenticate with the Flatfile API. Unlike environment-specific API keys, PATs are user-scoped tokens that inherit the permissions of the user who created them.
Personal Access Tokens:
* Are user-scoped authentication tokens
* Have the same auth scope as the user who created them
* Can be used in place of a JWT for API authentication
* Are ideal for scripts, automation, and integrations that need to act on behalf of a user
This opens up possibilities for various use cases, including building audit logs, managing Spaces, and monitoring agents across environments.
## Managing Personal Access Tokens
### Creating a Token
1. Log in to your Flatfile account
2. Click on your user profile dropdown in the top-right corner
3. Select "Personal Access Tokens"
4. Click "Create Token"
5. Enter a descriptive name for your token
6. Copy the generated token immediately - it will only be shown once
Make sure to copy your token when it's first created. For security reasons, you won't be able to view the token again after leaving the page.
### Retrieving a Personal Access Token (Legacy Method)
Your `publishableKey` and `secretKey` are specific to an environment. Therefore, to interact at a higher level, you can use a personal access token.
1. From the dashboard, open **Settings**
2. Click **Personal Tokens**
3. Retrieve your `clientId` and `secret`.
4. Using the key pair, call the auth endpoint:
```bash
curl -X POST https://platform.flatfile.com/api/v1/auth -H 'Content-Type: application/json' -d '{"clientId":"1234-1234", "secret":"1234-1234"}'
```
5. The response will include an `accessToken`. Present that as your **Bearer `token`** in place of the `secretKey`.
### Using a Token
Use your Personal Access Token in API requests by including it in the Authorization header.
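For example, a small helper that composes that header for a `fetch` call. The token value is a placeholder, and the endpoint in the comment is the same one used in the auth example above:

```typescript
// Build request headers that authenticate with a Personal Access Token.
function patHeaders(token: string): Record<string, string> {
  return {
    Authorization: `Bearer ${token}`,
    "Content-Type": "application/json",
  };
}

// Usage sketch:
// await fetch("https://platform.flatfile.com/api/v1/environments", {
//   headers: patHeaders("your_personal_access_token"),
// });
```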
### Listing Tokens
You can view all your active tokens in the Personal Access Tokens page. For each token, you can see:
* Name
* Creation date
* Last used date (if applicable)
### Deleting Tokens
To delete a token:
1. Navigate to the Personal Access Tokens page
2. Find the token you want to delete
3. Click the menu icon (three dots) next to the token
4. Select "Delete"
5. Confirm the deletion
Deleting a token immediately revokes access for any applications or scripts using it. Make sure you update any dependent systems before deleting a token.
## Retrieving Account-Level Objects
With the obtained Personal Access Token, you can use it as a Bearer token in place of the `secretKey` to perform account-level queries. This allows you access to every object across the Account.
By using the personal access token, you can query for all events across all environments, list all users and guests, determine the number of Spaces and their presence in different environments, and identify environments with agents deployed.
## Example Use Cases
### Building an Audit Log
Query for all events across all environments and combine them with user and guest data to create a comprehensive audit log, providing a detailed history of actions within the application.
### Managing Spaces Across Environments
Determine the number of Spaces available and identify which Spaces exist in different environments, allowing you to efficiently manage and organize your data.
### Monitoring Agents Across Environments
Keep track of agents deployed to various environments by retrieving information about their presence, ensuring smooth and efficient data import processes.
## Best Practices
* Create separate tokens for different applications or use cases
* Use descriptive names that identify where the token will be used
* Regularly review and delete unused tokens
* Rotate tokens periodically for enhanced security
* Never share your tokens with others - each user should create their own tokens
## API Reference
For programmatic management of Personal Access Tokens, see the [Personal Access Tokens API Reference](/api-reference/auth/personal-access-tokens).
# API Keys
Source: https://flatfile.com/docs/documentation/authentication/authentication
use API keys to authenticate API requests
API keys are created automatically. Use the [Developers
Page](https://platform.flatfile.com/developers) to reveal your API keys for
each Environment.
Flatfile provides two different kinds of environment-specific API keys you can
use to interact with the API. In addition, you can work in either a development
or a production environment.
Read on to learn more about how to think about which keys to use and how to test
in development mode.
## Testing and development
[Environments](/learning-center/architecture/environments) are isolated entities and are intended to be a safe place to create and
test different configurations. A `development` and `production` environment are created
by default.
| isProd | Name | Description |
| ------- | ------------- | ------------------------------------------------------------------------------------------- |
| *false* | `development` | Use this default environment, and its associated test API keys, as you build with Flatfile. |
| *true* | `production` | When you're ready to launch, create a new environment and swap out your keys. |
The development environment does not count towards your paid credits.
## Secret and publishable keys
All Accounts have two key types for each environment. Learn when to use each
type of key:
| Type | Id | Description |
| --------------- | ---------------------- | ----------------------------------------------------------------------------------------------------------------------- |
| Secret key | `sk_23ghsyuyshs7dcrty` | **On the server-side:** Store this securely in your server-side code. Don’t expose this key in an application. |
| Publishable key | `pk_23ghsyuyshs7dcert` | **On the client-side:** Can be publicly-accessible in your application's client-side code. Use when embedding Flatfile. |
The `accessToken` provided from `publishableKey` will remain valid for a
duration of 24 hours.
# Roles & Permissions
Source: https://flatfile.com/docs/documentation/authentication/roles-and-permissions
grant your team and customers access
## Administrator Roles
Administrator roles have full access to your accounts, including inviting
additional admins and seeing developer keys.
The `accessToken` provided will remain valid for a duration of 24 hours.
| Role | Details |
| ------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| Administrator | This role is meant for any member of your team who requires full access to the Account. ✓ Can add other administrators ✓ Can view secret keys ✓ Can view logs |
## Guest Roles
Guest roles receive access via a magic link or a shared link depending on the
[Environment](https://platform.flatfile.com/dashboard) `guestAuthentication`
type. Guests roles can invite other Guests unless you turn off this setting in
the [Guest Sidebar](/learning-center/guides/guest_sidebar).
The `accessToken` provided will remain valid for a duration of 1 hour.
### Space Grant
| Role | Details |
| ------------------ | -------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| Single-Space Guest | This role is meant for a guest who has access to only one Space. Such guests can be invited to additional Spaces at any time. |
| Multi-Space Guest | This role is meant for a guest who has access to multiple Spaces. They will see a drop-down next to the Space name that enables them to switch between Spaces. |
### Workbook Grant
| Role | Details |
| --------------------- | ------------------------------------------------------------------------------------------ |
| Single-Workbook Guest | This role is meant for a guest who should have access to only one Workbook within a Space. |
| Multi-Workbook Guest | This role is intended for a guest who has access to multiple Workbooks within a Space. |
This role can only be configured using code. See the code example below.
```js
const createGuest = await api.guests.create({
environmentId: "us_env_hVXkXs0b",
email: "guest@example.com",
name: "Mr. Guest",
spaces: [
{
id: "us_sp_DrdXetPN",
workbooks: [
{
id: "us_wb_qGZbKwDW",
},
],
},
],
});
```
### Guest Lifecycle
When a guest user is deleted, all their space connections are automatically removed to ensure security. This means:
* The guest loses access to all previously connected spaces
* They cannot regain access to these spaces without being explicitly re-invited
This automatic cleanup ensures that deleted guests cannot retain any access to spaces, even if they are later recreated with the same email address.
# Managing Agents
Source: https://flatfile.com/docs/documentation/core-libraries/cli/agent-management
manage your agents with the Flatfile CLI
## Listing Agents
To view all agents in your environment, use the following command:
```bash
npx flatfile agents list
```
## Downloading Agents
The `download` command allows you to download a copy of the agent's source code to your local machine. This is useful when you need to:
* Examine the code of a deployed agent
* Make modifications to an existing agent
* Back up your agent code
* Debug issues with a deployed agent
To download an agent, use the following command:
```bash
npx flatfile agents download
```
Use the `list` command to get the slug or id of the agent you want to manage.
## Delete An Agent
The `delete` command allows you to remove a deployed agent from your Flatfile environment. This is useful when you no longer need an agent or want to clean up your environment.
To delete an agent, use the following command:
```bash
npx flatfile delete
```
### Options
| Option | Description |
| -------------------- | ------------------------------- |
| `--slug` or `-s` | The slug of the agent to delete |
| `--agentId` or `-ag` | The ID of the agent to delete |
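For example, deleting an agent by its slug (the slug value here is a placeholder; use the `list` command to find the real one):

```bash
npx flatfile delete --slug my-agent
```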
# Deploying
Source: https://flatfile.com/docs/documentation/core-libraries/cli/deploy
npx flatfile deploy
For a production deployment, listeners can be hosted either in your cloud or
directly on the Flatfile platform.
## Flatfile-Hosted Agents (Recommended)
While your own cloud provides the maximum level of control over listener code,
qualities such as availability and network performance are the responsibility of
the hosting provider. Flatfile offers tooling to deploy and manage Listeners securely hosted
and run in the Flatfile cloud giving you a more consistent, low-latency, high-availability
experience.
### Deploying An Agent
Once you have a local configuration running, bundling and deploying your
listener to the Flatfile cloud can be done in a single command.
```bash
npx flatfile deploy
```
The CLI will attempt to locate the entrypoint of your listener in the following order:
* `./index.js`
* `./index.ts`
* `./src/index.js`
* `./src/index.ts`
If your listener is in another location you can provide the path as an argument:
```bash
npx flatfile deploy ./path-to/listener.ts
```
The CLI will provide updates as it's deploying:
```terminal
> npx flatfile deploy
✔ Code package compiled to .flatfile/build.js
✔ Code package passed validation
✔ Environment "production" selected
✔ Event listener deployed and running on your environment "production". us_ag_1234
```
Set your target environment `FLATFILE_ENVIRONMENT_ID` and API token `FLATFILE_API_KEY` in your .env file to avoid entering them when prompted.
### Dashboard Management
You can view and manage agents via the Flatfile Dashboard. The dashboard provides a simple interface for monitoring and controlling your deployed agents with the following capabilities:
| Feature | Description |
| --------------------- | --------------------------------------------------------------------------------------------------- |
| View Agent | Access details about your deployed agents including their latest deployment and last received event |
| View Logs | Monitor agent activity and troubleshoot issues by examining execution logs |
| Download Agent Code | Retrieve the deployed code for reference, backup, or refactoring purposes |
| Delete Agent | Remove agents that are no longer needed from your environment |
| Revert Agent | Roll back to a previous version if issues arise with newer deployments |
| Deploy Library Agents | Deploy pre-built agents from the Flatfile Agent Library |
### Re-Deploying
To update the code in your listener, simply make the desired changes and re-deploy.
Agents are versioned and you may revert to a previous version at any time via the dashboard.
### Multiple Agents
To deploy a second Agent without overwriting the first, specify a unique slug:
```terminal
npx flatfile deploy -s pink
```
When you pass a slug, the CLI will create a new Agent with the specified
slug. This allows you to deploy multiple Agents to the same environment.
To update an existing Agent, you can specify the slug of the Agent you want to
update by running the same command, including the slug.
If you do not specify a slug and have only one or no deployed Agents the CLI
will update your existing Agent, or create your first Agent. The slug for this
agent will be `default`.
## Hosting on Regional Servers
To deploy on a regional server, please contact our support team.
Regional servers are available upon request for those needing to host their
applications closer to their user base, ensuring faster access and compliance
with local data regulations.
### URL References
| Region | URL / SPACE\_URL | API URL |
| ------ | ------------------------ | ---------------------------- |
| UK | platform.uk.flatfile.com | platform.uk.flatfile.com/api |
| EU | platform.eu.flatfile.com | platform.eu.flatfile.com/api |
| AU | platform.au.flatfile.com | platform.au.flatfile.com/api |
| CA | platform.ca.flatfile.com | platform.ca.flatfile.com/api |
### Configuring API URL
When deploying, you can specify the `FLATFILE_API_URL` either in your project's
`.env` file or as an environment variable.
```bash .env
FLATFILE_API_URL=platform.eu.flatfile.com
```
### Embedding Via Space URL
To embed Flatfile in your application, include both `apiUrl` and `spaceUrl` in
your `FlatfileImporter` configuration to specify your regional server:
```typescript
const spaceProps: ISpace = {
name: "Embedded Space",
publishableKey: "pk_**********",
apiUrl: "Regional API URL here",
spaceUrl: "Regional Space URL here",
workbook,
listener,
// Additional properties here
};
```
### Configure API Client Environment
For direct API interactions, the FlatfileClient needs the environment parameter
set to your selected regional API URL:
```ts
import { FlatfileClient } from "@flatfile/api";
const api = new FlatfileClient({
environment: "Regional API URL here",
});
```
## Self-Hosting
Hosting listener code in your own cloud works similarly to how we run the
`develop` command in that:
1. The listener process is launched
2. It polls the Flatfile API for updates
3. It then responds to those Events accordingly
Reach out to support to learn more about hosting in your own cloud.
## Related Commands
* [develop](/documentation/core-libraries/cli/develop) - Run a local listener for development
* [list](/documentation/core-libraries/cli/agent-management#listing-agents) - List all deployed agents in your environment
* [download-agent](/documentation/core-libraries/cli/agent-management#downloading-agents) - Download an agent from your environment
* [delete](/documentation/core-libraries/cli/agent-management#delete-an-agent) - Delete an agent from your environment
# Developing
Source: https://flatfile.com/docs/documentation/core-libraries/cli/develop
npx flatfile develop
The Flatfile CLI provides a suite of tools to help you develop and deploy
[Listeners](/learning-center/concepts/listeners) to Flatfile.
Listeners are at the core of the Flatfile ecosystem, they are responsible for
picking up on [Events](/learning-center/concepts/events) and responding to them
accordingly.
Agents are listeners deployed server-side on Flatfile's secure cloud.
When developing with Flatfile, listeners can be [deployed](/documentation/core-libraries/cli/deploy) to
Flatfile on every change, or they can run directly on your machine.
### Commands
The Flatfile CLI provides the following commands:
| Command | Description |
| -------- | --------------------------------------- |
| develop | Run your project as a local listener |
| deploy | Deploy your project as a Flatfile Agent |
| list | List deployed Agents in an environment |
| download | Download an agent from your environment |
| delete | Delete an Agent |
### Before You Begin
You can add a `.env` file to bypass having to enter your API Key when running
the develop command.
```bash .env
FLATFILE_API_KEY=sk_123456
```
### Develop
Run `npx flatfile develop` from terminal to launch a local listener. As your
code changes, file changes are detected and will be reflected immediately.
Add the file name to the command: `npx flatfile develop path-to-file` if
you're not running an index file.
1. In your terminal, you'll see a detailed log of each HTTP request made within
a listener. It includes the request timing and status, enabling tracking and
debugging of HTTP operations.
2. A pub/sub stream system will deliver Events with a latency of approximately
10-50ms. This provides a local development experience closely resembling the
production environment speed.
3. Events that are being processed will be shown, along with their respective
handlers. You can then respond to those Events accordingly.
```bash
> npx flatfile develop
✔ 1 environment(s) found for these credentials
✔ Environment "development" selected
ncc: Version 0.36.1
ncc: Compiling file index.js into CJS
✓ 427ms GET 200 https://platform.flatfile.com/api/v1/subscription 12345
File change detected. 🚀
✓ Connected to event stream for scope us_env_1234
▶ commit:created 10:13:05.159 AM us_evt_1234
↳ on(**, {})
↳ on(commit:created, {"sheetSlug":"contacts"})
```
## Shared Environments
It is highly recommended to use an isolated environment when developing with
the local listener. That's why the CLI warns you when you're working in an
environment that already has a deployed agent.
Having deployed code running at the same time within the same development
environment can lead to interference, potentially resulting in a recursive loop
and undesirable consequences for your local listener.
To avoid such complications, it's advisable to keep your local environment
separate from any deployed code. This ensures optimal performance and helps
maintain a stable development environment.
# CLI
Source: https://flatfile.com/docs/documentation/core-libraries/cli/overview
Flatfile Command Line Interface
The Flatfile Command Line Interface (CLI) provides a set of tools to help you develop, deploy, and manage [Listeners](/learning-center/concepts/listeners) in your Flatfile environment.
Once listeners are deployed and hosted on Flatfile's secure cloud, they are called Agents.
## Installation
The Flatfile CLI is available as an npm package and can be invoked with npx.
```
npx flatfile@latest
```
## Authentication
To interact with the agents in your environment, you must authenticate with your Flatfile API key. You may do so in two ways:
1. Pass your credentials with your command via the `--token` and `--env` flags
2. Make the `FLATFILE_API_KEY` and `FLATFILE_ENVIRONMENT_ID` variables available in your environment, via a `.env` file in your project directory or other configuration
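For example, a `.env` file in your project root (both values are placeholders):

```bash
FLATFILE_API_KEY=sk_your_secret_key
FLATFILE_ENVIRONMENT_ID=us_env_your_environment_id
```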
## Available Commands
| Command | Description |
| --------------------------------------------------------------------------------- | ---------------------------------------------------- |
| [develop](/documentation/core-libraries/cli/develop) | Run your project as a local listener for development |
| [deploy](/documentation/core-libraries/cli/deploy) | Deploy your project as a Flatfile Agent |
| [list](/documentation/core-libraries/cli/agent-management#listing-agents) | List all deployed Agents in your environment |
| [download](/documentation/core-libraries/cli/agent-management#downloading-agents) | Download an agent from your environment |
| [delete](/documentation/core-libraries/cli/agent-management#delete-an-agent) | Delete an Agent from your environment |
## Development Workflow
A typical development workflow with the Flatfile CLI looks like this:
Configure your listener in local development using the develop command.
This command runs your listener locally.
It will respond to events in your configured environment just as it would once deployed,
but changes to your listener will be watched so you can test changes in real-time.
When ready, use the deploy command to deploy your listener as an Agent.
To facilitate easy management, name your listener via the `--slug/-s` flag.
Use the list command to view deployed Agents and the delete command to remove them when needed.
Use the download command to retrieve the code of a deployed Agent for inspection or modification.
## Environment Isolation
It's recommended to use an isolated environment for development to avoid conflicts with deployed Agents. The CLI will warn you if you're working in an environment that already has deployed Agents.
## Next Steps
* Learn more about [Listeners](/learning-center/concepts/listeners)
* Understand [Events](/learning-center/concepts/events) in Flatfile
* Explore [Workbooks](/learning-center/architecture/workbooks) and [Spaces](/learning-center/architecture/spaces)
# Listener
Source: https://flatfile.com/docs/documentation/core-libraries/listener
@flatfile/listener
The Flatfile Listener is a core building block of the Flatfile platform.
Listening and responding to events is what enables developers to build powerful
integrations with Flatfile. This package enables you to handle all the events in
your data pipeline — from the moment data arrives to its final transformation.
Think of it as your event command center.
### Basic Usage
```typescript
import { Listener } from "@flatfile/listener";
export default (listener: Listener) => {
listener.on("**", (event) => {
console.log(event);
});
};
```
## Core Components
Let's explore the building blocks that make the Listener powerful.
### Working with Records
These classes give you full control over your data, letting you read, modify,
and validate records with ease.
### Managing Sessions
Keep tabs on everything happening in your workspace — from active uploads to
schema changes.
### Event Handling
Catch and respond to any action in your data pipeline. Whether it's new data
arriving or a validation completing, you're in control.
### Authentication
Handle API calls securely with built-in authentication and header management.
# Plugins
Source: https://flatfile.com/docs/documentation/plugins
extend your data engine with Plugins
## Extractors
A plugin for parsing delimited files.
A plugin for parsing json files.
A plugin for parsing markdown files.
A plugin for parsing PDF files.
A plugin for parsing xlsx files.
A plugin for parsing .xml files.
A plugin for unzipping zip files and uploading content back in Flatfile.
A Flatfile plugin for extracting table data from HTML files.
## Transform
A plugin for automatically casting values.
A plugin for extending blueprint with external constraints.
Dedupe records in a sheet via a sheet level custom action.
A plugin for running custom logic on individual data records.
A Flatfile plugin for normalizing date formats.
A Flatfile Listener plugin for number validation.
A validator plugin for phone number formatting on individual data records.
## Automation
A plugin to provide automapping imported files for headless workflows.
## Export
A Flatfile plugin for exporting Workbooks to delimited files and zipping them.
A Flatfile plugin for generating pivot tables from sheet data and saving as...
A plugin for exporting data in Flatfile to Workbooks.
A plugin for egressing data from a Flatfile Workbook to a webhook.
## Schemas
A plugin for converting JSON Schema to Flatfile Blueprint and configuring
a...
A plugin for converting OpenAPI schema to Flatfile Blueprint.
A plugin for converting SQL DDL into Flatfile Blueprint.
A plugin for converting YAML Schema definitions to Flatfile Blueprint.
## Migrations
A plugin for using DXP class-based configurations.
## Core
A plugin for handling Flatfile Jobs.
A plugin for automatically rolling out new changes to workbook...
A plugin for configuring a Flatfile Space.
A plugin for configuring a Flatfile Space from a Space Template.
A plugin for making the view post mapping show only mapped columns.
## Utils
A library containing common utilities and helpers for extractors.
A utility for extracting data from any file and making it available as a
buffer.
This plugin handles response rejections returned from an external source.
## Currency
A Flatfile plugin for currency conversion using Open Exchange Rates API.
## Convert
A Flatfile Listener plugin for field translation using the Google
Translate...
A Flatfile plugin for converting What3Words addresses to standard
addresses...
## Enrich
A Flatfile plugin for geocoding addresses using the Google Maps Geocoding A...
A Flatfile plugin for parsing GPX files and extracting relevant data.
A Flatfile plugin for sentiment analysis of text fields in records.
A Flatfile plugin for text summarization and key phrase extraction.
## Import
A Flatfile plugin that generates example records using Faker.
A Flatfile plugin that generates example records using AI.
A Flatfile plugin for importing RSS feed data.
## Validate
A Flatfile plugin for boolean validation with multi-language support.
A Flatfile Listener plugin for email validation.
A Flatfile Listener plugin for ISBN validation with configurable options.
A Flatfile plugin for string configuration and validation.
# Embed Flatfile
Source: https://flatfile.com/docs/documentation/sdks/angular/new_space
create a new Space every time Flatfile is opened
For synchronous data import/exchange completed in one session, create a new
Space each time Flatfile is opened. This suits situations where a clean slate
for every interaction is preferred.
## Before you begin
Follow prompts from the `new` command.
```bash
ng new create-flatfile-angular-embed
```
```bash
cd create-flatfile-angular-embed
```
```bash
npm i @flatfile/angular-sdk @flatfile/plugin-record-hook @flatfile/listener
```
### Build your importer
### 1. Initialize Flatfile
In your `app.component.ts` you'll need to pass in a minimum of the
`publishableKey`.
```TypeScript src/app/app.component.ts
import { Component } from '@angular/core';
import { CommonModule } from '@angular/common';
import { RouterOutlet } from '@angular/router';
import { SpaceModule, SpaceService, ISpace } from '@flatfile/angular-sdk';
@Component({
selector: 'app-root',
standalone: true,
imports: [CommonModule, RouterOutlet, SpaceModule],
templateUrl: './app.component.html',
styleUrls: ['./app.component.css']
})
export class AppComponent {
title = 'create-flatfile-angular-embed';
showSpace: boolean = false;
constructor(private spaceService: SpaceService) {}
toggleSpace() {
this.spaceService.OpenEmbed();
this.showSpace = !this.showSpace;
}
closeSpace() {
this.showSpace = false;
}
spaceProps: ISpace = {
name: 'my space!',
publishableKey: 'pk_1234',
closeSpace: {
operation: 'submitActionFg',
onClose: this.closeSpace.bind(this),
},
displayAsModal: true,
}
}
```
```html src/app/app.component.html
<!-- The flatfile-space selector comes from the SDK's SpaceModule -->
<button (click)="toggleSpace()">Open Flatfile</button>
<flatfile-space
  *ngIf="showSpace"
  [spaceProps]="spaceProps"
></flatfile-space>
```
### 2. Start your client
Now, start your front end by heading to the terminal and running the following
command.
```bash
npm run start
```
### 3. Build a workbook
Now, let’s build a Workbook inside the Space for next time.
Add your `config.ts` file, and place the following code in it. After you have
done so, import the configuration to `app.component.ts`, and update `spaceProps`
to have the value of `config` under `workbook`. This config file has the
configuration settings for your workbook, so feel free to edit however necessary
to meet your needs.
```TypeScript src/config.ts
import { Flatfile } from "@flatfile/api";
export const config: Pick<
Flatfile.CreateWorkbookConfig,
"name" | "sheets" | "actions"
> = {
name: "Employees workbook",
sheets: [
{
name: "TestSheet",
slug: "TestSheet",
fields: [
{
key: "first_name",
type: "string",
label: "First name",
constraints: [
{
type: "required",
},
],
},
{
key: "last_name",
type: "string",
label: "Last name",
},
{
key: "email",
type: "string",
label: "Email",
},
],
actions: [
{
label: "Join fields",
operation: "TestSheet:join-fields",
description: "Would you like to join fields?",
mode: "foreground",
confirm: true,
},
],
},
],
actions: [
{
label: "Submit",
operation: "submitActionFg",
description: "Would you like to submit your workbook?",
mode: "foreground",
primary: true,
confirm: true,
},
],
};
```
```TypeScript src/app/app.component.ts
spaceProps: ISpace = {
name: 'my space!',
publishableKey: 'pk_1234',
closeSpace: {
operation: 'submitActionFg',
onClose: this.closeSpace.bind(this),
},
displayAsModal: true,
workbook: config,
};
```
### 4. Transform Data
Next, we'll listen for data changes and respond using an event listener.
1. Add a `src/listeners/listener.ts` file with this simple `recordHook`.
2. Update `app.component.ts` to import the listener.
Once you add this code, when a change occurs, we'll log the entered first name
and update the last name to "Rock." You'll immediately see this begin to work
when you add or update any records. Learn more about
[Handling Data](/learning-center/guides/handling-data)
```TypeScript src/listeners/listener.ts
import api from "@flatfile/api";
import { FlatfileListener } from "@flatfile/listener";
import { recordHook } from "@flatfile/plugin-record-hook";
/**
* Example Listener
*/
export const listener = FlatfileListener.create((listener) => {
listener.on("**", (event) => {
console.log(`Received event:`, event);
});
listener.use(
recordHook("TestSheet", (record) => {
const firstName = record.get("first_name");
console.log({ firstName });
record.set("last_name", "Rock");
return record;
})
);
listener.filter({ job: "workbook:submitActionFg" }, (configure) => {
configure.on("job:ready", async ({ context: { jobId } }) => {
try {
await api.jobs.ack(jobId, {
info: "Getting started.",
progress: 10,
});
// Make changes after cells in a Sheet have been updated
console.log("Make changes here when an action is clicked");
await api.jobs.complete(jobId, {
outcome: {
acknowledge: true,
message: "This is now complete.",
next: {
type: "wait",
},
},
});
} catch (error: any) {
console.error("Error:", error.stack);
await api.jobs.fail(jobId, {
outcome: {
message: "This job encountered an error.",
},
});
}
});
});
});
```
```TypeScript src/app/app.component.ts
import { config } from '../config';
import { listener } from '../listeners/listener';
// ...the existing imports from step 1 remain unchanged
@Component({
selector: 'app-root',
standalone: true,
imports: [CommonModule, RouterOutlet, SpaceModule],
templateUrl: './app.component.html',
styleUrl: './app.component.css'
})
export class AppComponent {
title = 'create-flatfile-angular-embed';
showSpace: boolean = false;
constructor(private spaceService: SpaceService) {}
toggleSpace() {
this.spaceService.OpenEmbed();
this.showSpace = !this.showSpace;
}
closeSpace() {
this.showSpace = false;
}
spaceProps: ISpace = {
name: 'my space!',
publishableKey: 'pk_1234',
closeSpace: {
operation: 'submitActionFg',
onClose: this.closeSpace.bind(this),
},
displayAsModal: true,
workbook: config,
listener
}
}
```
### 5. Match your brand
By attaching a `themeConfig` to `spaceProps`, we will now override colors in
your Space to match your brand. See all of the options here in the
[Theming Reference](/learning-center/guides/theming).
```typescript
themeConfig: {
root: {
primaryColor: "red",
textColor: "white",
logo: "https://images.ctfassets.net/hjneo4qi4goj/gL6Blz3kTPdZXWknuIDVx/7bb7c73d93b111ed542d2ed426b42fd5/flatfile.svg",
},
},
```
### 6. Add customizations
You can stop here or you can [view our full reference](../reference/common) to
see all the ways you can customize your importer.
## Example Project
Find this Angular example project in the Flatfile GitHub repository.
Clone the Flatfile Angular tutorial here.
# Reusable Spaces
Source: https://flatfile.com/docs/documentation/sdks/angular/reuse_space
reuse a Space when Flatfile is opened
For applications meant to use the same space consistently, open an existing
space each time Flatfile is opened. This suits situations where consistently
editing a dataset is preferred.
## Before you begin
If you have already tried our [Embed a New Space](./new_space) guide, note that
this guide differs significantly. Start in a new directory; adapting the
earlier project would be harder than building from scratch.
Follow prompts from the `new` command.
```bash
ng new create-flatfile-angular-embed
```
```bash
cd create-flatfile-angular-embed
```
```bash
npm i
```
Install packages.
```bash
npm i @flatfile/api @flatfile/listener @flatfile/plugin-record-hook @flatfile/angular-sdk express && npm i --save-dev concurrently nodemon ts-node
```
#### 1. Set up Configuration Files
This app has some configuration files that must be set up before you can get
started with development. Set them up as shown below.
```JSON nodemon.json
{
"watch": ["server"],
"ext": "ts,json",
"ignore": ["src/**/*.spec.ts"],
"exec": "ts-node --esm ./server/index.ts"
}
```
You will also want to add some scripts to your `package.json` to start the app.
Add the following scripts:
```JSON package.json (snippet)
"scripts": {
"dev:frontend": "ng serve",
"dev:backend": "nodemon",
"dev": "concurrently 'npx tsc --watch' 'npm:dev:frontend' 'npm:dev:backend'",
},
```
#### 2. Create the Server
Let's create the server that will act as the backend of the application. It
will serve the pages and fetch the existing Space; for security reasons, the
secret key must never be exposed to the browser.
You'll want to replace the token value with your secret key or API token.
```TypeScript server/index.ts
import express from "express";
import { FlatfileClient } from "@flatfile/api";
const port = 3000;
const app = express();
app.get('/api/spaces/:id', async (_req, res)=>{
const {id} = _req.params;
const flatfile = new FlatfileClient({
token: 'secret_key',
environment: 'https://platform.flatfile.com/api/v1',
});
try {
const space = await flatfile.spaces.get(id);
res.json({ space });
} catch (error) {
console.error("Error retrieving space:", error);
res.status(500).json({ error: "Failed to retrieve space" });
}
})
app.listen(port, () => {
console.log("Server listening on port", port);
});
```
#### 3. Build your existing Space Component
The component should end up looking something like this:
```TypeScript src/app/app.component.ts
import { Component } from '@angular/core';
import { CommonModule } from '@angular/common';
import { RouterOutlet } from '@angular/router';
import { SpaceModule, SpaceService, ISpace } from '@flatfile/angular-sdk';
@Component({
selector: 'app-root',
standalone: true,
imports: [CommonModule, RouterOutlet, SpaceModule],
templateUrl: './app.component.html',
styleUrls: ['./app.component.css'],
})
export class AppComponent {
title = 'create-flatfile-angular';
showSpace: boolean = false;
constructor(private spaceService: SpaceService) {}
toggleSpace() {
this.spaceService.OpenEmbed();
this.showSpace = !this.showSpace;
}
closeSpace() {
this.showSpace = false;
}
spaceProps: ISpace = {
space: {
id: 'us_sp_1234',
accessToken: 'sk_1234'
},
closeSpace: {
operation: 'submitActionFg',
onClose: this.closeSpace.bind(this),
},
displayAsModal: true,
}
fetchData = async (spaceId: string) => {
const response = await fetch(`http://localhost:3000/api/spaces/${spaceId}`);
const json = await response.json();
if(json.error){
return
}
if(this.spaceProps.space) {
this.spaceProps.space.accessToken = json.space.data.accessToken;
}
}
ngOnInit() {
if(this.spaceProps.space?.id){
this.fetchData(this.spaceProps.space.id).catch(
(err) => {
console.error(err)
}
);
}
}
}
```
You'll need to update your `app.component.html` to include the existing space
component. It should end up looking like the below:
```html src/app/app.component.html
<!-- The flatfile-space selector comes from the SDK's SpaceModule -->
<button (click)="toggleSpace()">Open Flatfile</button>
<flatfile-space
  *ngIf="showSpace"
  [spaceProps]="spaceProps"
></flatfile-space>
```
#### 4. Start your client
Now you should be able to start your app. To load it in dev mode and ensure
everything works properly, run:
```bash
npm run dev
```
If you hit any errors, now is the time to fix them; otherwise, you're ready to
deploy!
#### 5. Customize
You can stop here or you can [view our full reference](../reference/common) to
see all the ways you can customize your importer.
## Example Project
Find this Angular example project in the Flatfile GitHub repository.
Clone the Full Flatfile Angular tutorial here.
# Embed Flatfile
Source: https://flatfile.com/docs/documentation/sdks/javascript/new_space
create a new Space every time Flatfile is opened
For synchronous data import/exchange completed in one session, create a new
Space each time Flatfile is opened. This suits situations where a clean slate
for every interaction is preferred.
## Before you begin
## Prepare your project
### Install packages
Make a new directory.
```bash
mkdir example-flatfile-js-embed
```
Go into that directory.
```bash
cd example-flatfile-js-embed
```
Follow prompts from the `init` command.
```bash
npm init
```
Install packages. (We'll use `parcel` for bundling.)
```bash
npm i @flatfile/javascript @flatfile/listener @flatfile/plugin-record-hook @flatfile/api flatfile
```
Install dev packages.
```bash
npm i parcel --save-dev
```
### Create your file structure
Setup your app to look something like this:
```
├── public/
└── index.html
└── styles.css
├── src/
└── client.js
└── workbook.js
└── listener.js
├── package.json <--- already created
└── package-lock.json <--- already created
```
In this file structure, your app should have two main directories, `public` and
`src`.
The `public` directory contains the `index.html` file, which is the entry point
of the application's front-end, and the `styles.css` file for styling the
iframe.
The `src` directory contains the main components and logic of the application,
including the `client.js` file, which initializes Flatfile and passes in
available options.
## Build your importer
### 1. Add a Flatfile button
Add a button to your application to open Flatfile in a modal. Pass in your
`publishableKey` and a new Space will be created on each page load. Also, add
the content here to your `styles.css`.
```html public/index.html (snippet)
<!-- The publishableKey value is a placeholder -->
<button
  class="contrast"
  onclick="openFlatfile({ publishableKey: 'pk_1234' })"
>
  Open Flatfile
</button>
```
```html public/index.html (full page)
<!doctype html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <title>Hello, world!</title>
    <link rel="stylesheet" href="styles.css" />
    <script type="module" src="../src/client.js"></script>
  </head>
  <body>
    <div class="content">
      <h1>Hello, world!</h1>
      <p>Embed Flatfile in just a few lines of code.</p>
      <!-- The publishableKey value is a placeholder -->
      <button class="contrast" onclick="openFlatfile({ publishableKey: 'pk_1234' })">
        Open Flatfile
      </button>
    </div>
  </body>
</html>
```
### 2. Initialize Flatfile
In your `client.js`, at minimum, you'll need to receive the `publishableKey` set
from when you called `openFlatfile`.
```js src/client.js
import { initializeFlatfile } from "@flatfile/javascript";
//create a new space in modal
window.openFlatfile = ({ publishableKey }) => {
if (!publishableKey) {
throw new Error(
"You must provide a publishable key - pass through in index.html",
);
}
const flatfileOptions = {
name: "Embedded Space",
publishableKey,
externalActorId: 'test-1',
sidebarConfig: {
showSidebar: false,
},
closeSpace: {
operation: "submitActionFg",
onClose: () => {
//custom code if needed
},
},
// Additional props...
};
initializeFlatfile(flatfileOptions);
};
```
### 3. Start your client
Now, start your front end by heading to terminal and running the following
command. To see that it’s running, visit [http://localhost:1234](http://localhost:1234) (or the port it
is running on) and you should see your page and a button. Click the button and
see that an empty Space gets created.
```bash
npx parcel public/index.html
```
### 4. Customize
You can stop here or you can [view our full reference](../reference/common) to
see all the ways you can customize your importer.
## Example Project
Find this JavaScript example project in the Flatfile GitHub repository.
Clone the Flatfile JavaScript tutorial here.
# Reusable Spaces
Source: https://flatfile.com/docs/documentation/sdks/javascript/reuse_space
reuse a Space when Flatfile is opened
Reuse a Space when users might need to wait or can't finish in one go. It's
great for keeping work context and letting users continue where they left off
until the task is done.
## Before you begin
To reuse an existing Space, we'll update our files to pass in a Space Id instead
of a `publishableKey`. We'll then make a server-side request using our
`secretKey` to get the Space and its access token.
## Prepare your project
### Install packages
Make a new directory.
```bash
mkdir reuse-example-flatfile-js-embed
```
Go into that directory.
```bash
cd reuse-example-flatfile-js-embed
```
Follow prompts from the `init` command.
```bash
npm init
```
Install packages.
```bash
npm i @flatfile/javascript @flatfile/listener @flatfile/plugin-record-hook @flatfile/api cors dotenv express flatfile
```
Install dev packages.
```bash
npm i parcel --save-dev
```
### Create your file structure
Setup your app to look something like this:
```
├── public/
└── index.html
└── styles.css
├── src/
└── client.js
└── server.mjs
├── .env
├── package.json <--- already created
└── package-lock.json <--- already created
```
In this file structure, your app should have two main directories, `public` and
`src`.
The `public` directory contains the `index.html` file, which is the entry point
of the application's front-end, and the `styles.css` file for styling the iframe.
The `src` directory contains the main components and logic of the application,
including the `client.js` file, which initializes Flatfile and passes in
available options, and the `server.mjs` file, which sets up a Node.js server
using Express that listens for incoming requests and communicates with the
Flatfile API to retrieve data about a specified Space.
### Update your .env
Update your `.env` file. `FLATFILE_API_KEY` is your secret key and `SPACE_ID`
is the Space you want to open in the importer. The Space ID can be found on
your Dashboard where your Spaces are listed. You shouldn't need to update the
`BASE_URL`.
```
BASE_URL=https://platform.flatfile.com/api
FLATFILE_API_KEY=sk_1234
SPACE_ID=us_sp_1234
```
## Build your importer
### 1. Add a Flatfile button
Add a button to your application to open Flatfile in a modal. Here the client
doesn't need a `publishableKey`; the Space details come from your server. Also,
add the content here to your `styles.css`.
```html public/index.html (snippet)
<button class="contrast" onclick="openFlatfile()">
  Open Flatfile
</button>
```
```html public/index.html (full page)
<!doctype html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <title>Hello, world!</title>
    <link rel="stylesheet" href="styles.css" />
    <script type="module" src="../src/client.js"></script>
  </head>
  <body>
    <div class="content">
      <h1>Hello, world!</h1>
      <p>Embed Flatfile in just a few lines of code.</p>
      <button class="contrast" onclick="openFlatfile()">
        Open Flatfile
      </button>
    </div>
  </body>
</html>
```
### 2. Create a local server
This code sets up a Node.js server using Express that listens for incoming
requests, enables CORS (Cross-Origin Resource Sharing), and communicates with
the Flatfile API to retrieve data about a specified Space based on the provided
environment variables.
```jsx src/server.mjs
import dotenv from "dotenv";
import express from "express";
import { FlatfileClient } from "@flatfile/api";
import cors from "cors"; // Import the cors module
dotenv.config();
const app = express();
const port = 8080;
console.log(process.env.SPACE_ID);
console.log(process.env.FLATFILE_API_KEY);
const flatfile = new FlatfileClient({
token: process.env.FLATFILE_API_KEY,
environment: process.env.BASE_URL + "/v1",
});
// Enable CORS middleware
app.use(cors());
app.get("/", (req, res) => {
res.send("Hello, world!");
});
app.get("/space", async (req, res) => {
try {
const space = await flatfile.spaces.get(process.env.SPACE_ID);
res.json(space);
} catch (error) {
console.error("Error retrieving space:", error);
res.status(500).json({ error: "Failed to retrieve space" });
}
});
app.listen(port, () => {
console.log(`Server is running on port ${port}`);
});
```
### 3. Start your server
Now, start your server by heading to terminal and running the following command.
To see that it’s running, visit [http://localhost:8080](http://localhost:8080) (or the port it is
running on) and you should see a page that says "Hello, world!".
```bash
npx node src/server.mjs
```
### 4. Initialize Flatfile
In your `client.js`, at minimum, you'll need to get and pass the `space` info.
This code opens an existing Space in a modal by making a request to your server
endpoint and initializing the Flatfile data import with specified options based
on the server's response.
```js src/client.js
import { initializeFlatfile } from "@flatfile/javascript";
const server_url = "http://localhost:8080";
//open existing space in modal
window.openFlatfile = () => {
fetch(server_url + "/space") // Make a request to the server endpoint
.then((response) => response.json())
.then((space) => {
const flatfileOptions = {
space: {
id: space && space.data && space.data.id,
accessToken: space && space.data && space.data.accessToken,
},
sidebarConfig: {
showSidebar: false,
},
externalActorId: 'test-1',
// Additional props...
};
initializeFlatfile(flatfileOptions);
})
.catch((error) => {
console.error("Error retrieving space in client:", error);
});
};
```
### 5. Start your client
Now, start your front end by heading to terminal, opening a new tab, and running
the following command. To see that it’s running, visit [http://localhost:1234](http://localhost:1234)
(or the port it is running on) and you should see your page and a button. Click
the button and see that your Space loads. **That's it!**
```bash
npx parcel public/index.html
```
### 6. Customize
You can stop here or you can [view our full reference](../reference/common) to
see all the ways you can customize your importer.
## Example Project
Find this JavaScript example project in the Flatfile GitHub repository.
Clone the Flatfile JavaScript tutorial here.
# Components
Source: https://flatfile.com/docs/documentation/sdks/react/components
configure your Space in React
## Components
### FlatfileProvider
`FlatfileProvider` is a comprehensive React component designed for integrating
Flatfile's import capabilities into your application. It can be initialized
using either a `publishableKey` or an `accessToken`, providing flexibility
depending on your authentication flow. The component allows for extensive
customization of the embedded iFrame through styling parameters, ensuring that
the import modal matches your application's aesthetics. It maintains the
configuration necessary for creating and managing Spaces, Workbooks, and
Documents within Flatfile. Additionally, `FlatfileProvider` manages a
client-side listener to handle browser-based events, ensuring a seamless user
interaction with the import process.
Learn more about the
[FlatfileProvider](/documentation/sdks/react/components/FlatfileProvider)
```tsx
// publishableKey is a placeholder value
const App = () => (
  <FlatfileProvider publishableKey="pk_1234">
    {/* Space, Workbook, and Document components go here */}
  </FlatfileProvider>
);
```
### Space
The `Space` component from the `@flatfile/react` package is utilized to define a
collaborative environment or workspace within Flatfile's import system. It can
be configured to create a new space or to reuse an existing one by providing a
space ID.
#### Main Props
* `config`: Sets up the configuration for a new space, including theming and
metadata.
* `id`: An optional prop that, when provided, indicates the specific existing
space to be reused instead of creating a new one.
Learn more about the [Space](/documentation/sdks/react/components/Space)
Example usage for creating a new space:
```tsx
<FlatfileProvider publishableKey="pk_1234">
  <Space
    config={{
      name: "My New Space",
      metadata: {
        // any custom metadata for the Space (illustrative)
        userId: "1234",
      },
    }}
  />
</FlatfileProvider>
```
When you want to re-use a space, you'll need to pass an `accessToken` to the
`FlatfileProvider`, then this is where you'll add the `id` of the Space you want
to re-use.
```tsx
const FFApp = () => <Space id="us_sp_1234" />;
const App = () => {
  return (
    <FlatfileProvider accessToken="sk_1234">
      <FFApp />
    </FlatfileProvider>
  );
};
```
### Workbook
Configures the Workbook creation request data and provides new helper
functions. An `onSubmit` callback and a new `onRecordHooks` helper can be added
to the Workbook, and a Workbook object containing the sheets can be passed
along in the `config`.
`onRecordHooks` takes an array of record hooks. Each one can take a slug for
manually setting each per sheet. Otherwise, if no slug is added, it will apply
the record hook to the corresponding sheet in the Workbook index.
Learn more about the [Workbook](/documentation/sdks/react/components/Workbook)
```tsx
// `workbook` holds your Flatfile.CreateWorkbookConfig
<Workbook
  config={workbook}
  onSubmit={async ({ sheet }) => {
    console.log("onSubmit", { sheet });
  }}
  onRecordHooks={[
    [
      "contacts",
      (record) => {
        record.set("email", "TEST SHEET RECORD");
        return record;
      },
    ],
  ]}
/>
```
### Sheet
The `Sheet` component from the `@flatfile/react` package integrates Flatfile's
data import functionality into React applications. It simplifies configuring the
data import process and managing the lifecycle of data submission and record
handling.
#### Main Props
* `config`: Defines the structure and settings of the data to be imported.
* `onSubmit`: A callback function that is triggered upon successful data
submission.
* `onRecordHook`: A function that allows for custom record manipulation during
the import process.
* `submitSettings`: Customizes the behavior of the data submission process.
The Sheet option is similar to the
[Simplified SDK Approach](/documentation/sdks/reference/simple). Learn more
about the [Sheet](/documentation/sdks/react/components/Sheet)
Example usage:
```tsx
// `sheetConfig` holds your Flatfile.SheetConfig (name illustrative)
<Sheet
  config={sheetConfig}
  onRecordHook={(record) => {
    record.set("email", "TEST SHEET RECORD");
    return record;
  }}
  onSubmit={(sheet) => {
    console.log("onSubmit", { sheet });
  }}
/>
```
### Document
Configures a Document to be added to the Space. Takes a simple
`Flatfile.DocumentConfig` as a param.
```tsx
// The config values below are placeholders
const FFApp = () => (
  <Document
    config={{
      title: "My Document",
      body: "Document body text",
    }}
  />
);
```
## React Hooks 🪝
### useFlatfile
This Hook exposes a few handy functions for integrating with the
`FlatfileProvider`:
* `openPortal()`: Opens the iFrame. This will create the underlying Space,
Workbook and configure if necessary or open the Space provided.
* `closePortal()`: Closes the iFrame.
* `open`: current open status
* `listener`: Current listener
* `setListener()`: manually sets the listener
### useListener
This Hook exposes and adds logic to the current listener, with conditional
dependencies:
```tsx
useListener(
(listener) => {
listener.on("**", (event) => {
console.log("initialListener Event => ", event.topic);
// Handle the workbook:deleted event
});
},
[label],
);
```
### usePlugin
This Hook exposes and adds a plugin to the current listener, with conditional
dependencies:
```tsx
usePlugin(
recordHook("contacts", (record, event) => {
console.log("recordHook", { event });
record.set("lastName", label);
return record;
}),
[label],
);
```
### useEvent
This Hook exposes any event coming from the iFrame so you can respond to it
with proper filtering:
```tsx
useEvent("workbook:created", (event) => {
console.log("workbook:created", { event });
});
useEvent("*:created", (event) => {
console.log({ topic: event.topic });
});
useEvent("job:ready", { job: "sheet:submitActionFg" }, async (event) => {
const { jobId } = event.context;
try {
await api.jobs.ack(jobId, {
info: "Getting started.",
progress: 10,
});
// Make changes after cells in a Sheet have been updated
console.log("Make changes here when an action is clicked");
const records = await event.data;
console.log({ records });
await api.jobs.complete(jobId, {
outcome: {
message: "This is now complete.",
},
});
// Probably a bad idea to close the portal here but just as an example
await sleep(3000);
closePortal();
} catch (error: any) {
console.error("Error:", error.stack);
await api.jobs.fail(jobId, {
outcome: {
message: "This job encountered an error.",
},
});
}
});
```
## Full Example
```tsx
import { useState } from "react";
import api from "@flatfile/api";
import { FlatfileProvider, Space, Workbook } from "@flatfile/react";
import { useFlatfile, useListener, usePlugin, useEvent } from "@flatfile/react";
import { recordHook } from "@flatfile/plugin-record-hook";
import { workbook } from "./workbook";
import { listener as importedListener } from './listener'
const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms))
const FFApp = () => {
const { open, openPortal, closePortal } = useFlatfile()
const [lastName, setLastName] = useState('Rock')
const togglePortal = () => {
open ? closePortal() : openPortal()
}
useListener(
(listener) => {
listener.on('**', (event) => {
console.log('initialListener Event => ', event.topic)
// Handle the workbook:deleted event
})
importedListener
},
[lastName]
)
// import file directly
useListener(importedListener, [])
usePlugin(
recordHook('contacts', (record, event) => {
console.log('recordHook', { event })
record.set('lastName', lastName)
return record
}),
}),
[lastName]
)
useEvent('workbook:created', (event) => {
console.log('workbook:created', { event })
})
useEvent('*:created', (event) => {
console.log({ topic: event.topic })
})
useEvent('job:ready', { job: 'sheet:submitActionFg' }, async (event) => {
const { jobId } = event.context
try {
await api.jobs.ack(jobId, {
info: 'Getting started.',
progress: 10,
})
// Make changes after cells in a Sheet have been updated
console.log('Make changes here when an action is clicked')
const records = await event.data
console.log({ records })
await api.jobs.complete(jobId, {
outcome: {
message: 'This is now complete.',
},
})
// Probably a bad idea to close the portal here but just as an example
await sleep(3000)
closePortal()
} catch (error: any) {
console.error('Error:', error.stack)
await api.jobs.fail(jobId, {
outcome: {
message: 'This job encountered an error.',
},
})
}
})
return (
  <div>
    <button onClick={togglePortal}>{open ? 'Close' : 'Open'} Flatfile</button>
    {/* The Space config here is illustrative; sheets come from ./workbook */}
    <Space config={{ name: 'My Space' }}>
      <Workbook config={workbook} />
    </Space>
  </div>
)
}
const App = () => {
return (
  // publishableKey below is a placeholder value
  <FlatfileProvider publishableKey="pk_1234">
    <FFApp />
  </FlatfileProvider>
)
}
export default App
```
# Legacy
Source: https://flatfile.com/docs/documentation/sdks/react/legacy
create a new Space every time Flatfile is opened
For synchronous data import/exchange completed in one session, create a new
Space each time Flatfile is opened. This suits situations where a clean slate
for every interaction is preferred.
## Before you begin
## Prepare your project
### Install packages
Make a new directory.
```bash
mkdir example-flatfile-react-embed
```
Go into that directory.
```bash
cd example-flatfile-react-embed
```
Follow prompts from the `init` command.
```bash
npm init
```
Install packages.
```bash
npm i @flatfile/react @flatfile/listener @flatfile/plugin-record-hook @flatfile/api flatfile react react-dom react-scripts
```
### Create your file structure
Setup your app to look something like this:
```
├── public/
└── index.html
└── styles.css
├── src/
└── App.tsx
└── index.tsx
└── workbook.ts (wait to add this)
└── listener.ts (wait to add this)
├── tsconfig.json
├── package.json <--- already created
└── package-lock.json <--- already created
```
In this file structure, your app should have two main directories, `public` and
`src`.
The `public` directory contains the `index.html` file, which is the entry point
of the application's front-end, and the `styles.css` file for styling the iframe.
The `src` directory contains the main components and logic of the application,
including the `App.tsx` file, which initializes Flatfile and passes in available
options.
## Build your importer
### 1. Add a Flatfile button
Add a button to your application to open Flatfile in a modal. Pass in your
`publishableKey` and a new Space will be created on each page load. Also, add
the content here to your `styles.css`.
```html public/index.html (full page)
<!doctype html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <title>React App</title>
    <link rel="stylesheet" href="styles.css" />
  </head>
  <body>
    <div id="root"></div>
    <script type="module" src="../src/index.tsx"></script>
  </body>
</html>
```
```css public/styles.css
/* Styles for home page */
html,
body {
height: 100%;
margin: 0;
padding: 0;
font-family: sans-serif;
background: #090b2b;
color: #fff;
}
#root {
display: flex;
align-items: center;
justify-content: center;
height: 100%;
}
.content {
/* Adjust the width and height as needed */
width: 800px;
height: 200px;
padding: 0 20px 0 50px;
}
.contrast {
padding: 10px;
margin: 10px 5px 0 0;
border-radius: 4px;
border: 0;
font-weight: 600;
cursor: pointer;
}
/* End of styles for home page */
:root {
--ff-primary-color: #4c48ef !important;
--ff-secondary-color: #616a7d !important;
--ff-text-color: #090b2b !important;
--ff-dialog-border-radius: 4px !important;
--ff-border-radius: 5px !important;
--ff-bg-fade: rgba(0, 0, 0, 0.2) !important;
}
/* The default mount element */
/* #flatfile_iFrameContainer {
} */
/* A div around the iframe that contains Flatfile */
/* .flatfile_iframe-wrapper {
} */
/* The actual iframe that contains Flatfile */
/* #flatfile_iframe {
} */
.flatfile-close-button {
display: none !important;
}
/* Begin style overrides for when Flatfile is displayed as a modal */
/* This class gets appended to the flatfile_iframe-wrapper div */
.flatfile_displayAsModal {
padding: 50px !important;
width: calc(100% - 100px) !important;
height: calc(100vh - 100px) !important;
}
.flatfile_iframe-wrapper.flatfile_displayAsModal {
background: var(--ff-bg-fade) !important;
}
/* The close button in top right to close modal */
.flatfile_displayAsModal .flatfile-close-button {
display: block !important;
margin: 20px !important;
}
/* The icon for the close button in top right to close modal */
.flatfile_displayAsModal .flatfile-close-button svg {
fill: var(--ff-secondary-color) !important;
}
/* The actual iframe that contains Flatfile */
.flatfile_displayAsModal #flatfile_iframe {
border-radius: var(--ff-border-radius);
}
/* Begin style overrides for when you cancel out of the Flatfile modal */
/* The outer container of the modal that opens when you cancel out of Flatfile */
.flatfile_outer-shell {
background-color: var(--ff-bg-fade) !important;
border-radius: var(--ff-border-radius) !important;
}
/* The inner container of the modal that opens when you cancel out of Flatfile */
/* .flatfile_inner-shell {
} */
/* The white box inside the modal that opens when you cancel out of Flatfile */
.flatfile_modal {
border-radius: var(--ff-dialog-border-radius) !important;
}
/* The container for the buttons you see in the close modal */
/* .flatfile_button-group {
} */
/* Style the buttons you see in the close modal */
/* .flatfile_button {
} */
/* The "yes, cancel" button you see in the close modal */
.flatfile_primary {
border: 1px solid var(--ff-primary-color) !important;
background-color: var(--ff-primary-color) !important;
color: #fff;
}
/* The "no, stay" button you see in the close modal */
.flatfile_secondary {
color: var(--ff-secondary-color) !important;
}
/* The heading text you see in the close modal */
.flatfile_modal-heading {
color: var(--ff-text-color) !important;
}
/* The description text you see in the close modal */
.flatfile_modal-text {
color: var(--ff-secondary-color) !important;
}
/* End style overrides for when you cancel out of the Flatfile modal */
/* End style overrides for when Flatfile is displayed as a modal */
/* The container of the error component */
/* .ff_error_container {
}*/
/* The heading text you see in the error component */
/* .ff_error_heading {
}*/
/* The description text you see in the error component */
/* .ff_error_text {
}*/
```
### 2. Initialize Flatfile
In your `App.tsx`, at minimum, you'll need to pass in the `publishableKey`.
Also, add the content here to your `index.tsx`, `tsconfig.json`,
`manifest.json`, and `config-overrides.js`.
```tsx src/App.tsx
import { Dispatch, SetStateAction, useState } from "react";
import { ISpace, initializeFlatfile } from "@flatfile/react";
import { workbook } from "./workbook";
import { listener } from "./listeners/simple";
export default function App() {
const spaceProps: ISpace = {
name: "Embedded Space",
publishableKey: "pk_1234",
};
const [showSpace, setShowSpace] = useState(false);
const { Space, OpenEmbed } = initializeFlatfile({
...spaceProps,
sidebarConfig: {
showSidebar: false,
},
closeSpace: {
operation: "submitActionFg",
onClose: () => setShowSpace(false),
},
});
const onOpenSpace = async () => {
setShowSpace(!showSpace);
await OpenEmbed();
};
return (
  <div className="content">
    <h2>Embed Flatfile in just a few lines of code.</h2>
    {/* Button to trigger the modal */}
    <button className="contrast" onClick={onOpenSpace}>
      {showSpace ? "Close" : "Open and create new Space"}
    </button>
    {showSpace && <Space />}
  </div>
);
}
```
```js src/index.tsx
import React from "react";
import ReactDOM from "react-dom";
import App from "./App";
const rootElement = document.getElementById("root")!;
ReactDOM.render(<App />, rootElement);
```
```json tsconfig.json
{
"compilerOptions": {
"target": "es5",
"lib": ["dom", "dom.iterable", "esnext"],
"allowJs": true,
"skipLibCheck": true,
"esModuleInterop": true,
"allowSyntheticDefaultImports": true,
"strict": true,
"forceConsistentCasingInFileNames": true,
"module": "esnext",
"moduleResolution": "node",
"resolveJsonModule": true,
"isolatedModules": true,
"noEmit": true,
"jsx": "react-jsx",
"noUnusedLocals": true,
"noUnusedParameters": true,
"noImplicitReturns": true,
"noFallthroughCasesInSwitch": true
},
"include": ["src"]
}
```
```json public/manifest.json
{
"short_name": "React App",
"name": "Create React App Sample",
"start_url": ".",
"display": "standalone",
"theme_color": "#000000",
"background_color": "#ffffff"
}
```
```js public/config-overrides.js
const webpack = require("webpack");
module.exports = function override(config) {
const fallback = config.resolve.fallback || {};
Object.assign(fallback, {
stream: require.resolve("stream-browserify"),
assert: require.resolve("assert"),
zlib: require.resolve("browserify-zlib"),
});
config.resolve.fallback = fallback;
config.plugins = (config.plugins || []).concat([
new webpack.ProvidePlugin({
process: "process/browser",
Buffer: ["buffer", "Buffer"],
}),
]);
config.ignoreWarnings = [/Failed to parse source map/];
config.module.rules.push({
test: /\.(js|mjs|jsx)$/,
enforce: "pre",
loader: require.resolve("source-map-loader"),
resolve: {
fullySpecified: false,
},
});
return config;
};
```
### 3. Start your client
1. Update your package.json to include this script:
```json
"scripts": {
"start": "react-app-rewired start",
"build": "react-app-rewired build",
"test": "react-app-rewired test"
}
```
2. Now, start your front end by heading to terminal and running the following
command.
```bash
npm run start
```
You'll get an alert: "We're unable to detect target browsers. Would you like to
add the defaults to your package.json?" Say yes.
3. To see that it’s running, visit [http://localhost:3000](http://localhost:3000) (or the port it is
   running on) and you should see your page and a button. Click the button and
   see that an empty Space gets created.
### 4. Add customizations
You can stop here or you can [view our full reference](../reference/common) to
see all the ways you can customize your importer.
## Example Project
Find this React example project in the Flatfile GitHub repository.
Clone the Flatfile React tutorial here.
# Advanced
Source: https://flatfile.com/docs/documentation/sdks/reference/advanced
less commonly used embedded properties
## Reusing Spaces
During the configuration of embedded Flatfile, you have the flexibility to
either reuse an existing Space or generate a new one. These properties come into
play when you're working with an already established Space.
### space
Using your secret key, you'll make a request to a server endpoint to retrieve
an `accessToken`.
```jsx src/client.js
const flatfileOptions = {
publishableKey,
space: { id: string, accessToken: string },
//additional props
};
```
## Overrides
### mountElement
The element Flatfile will mount to. (Will create if it doesn't already exist.)
```typescript src/client.js
const flatfileOptions = {
publishableKey,
mountElement: "flatfile_hello",
//additional props
};
```
### loading
A default loading state when the Space is loading. Optionally, you can
override the default Loading component.
```typescript
const LoadingComponent = () => <div>Loading...</div>;
const flatfileOptions = {
  //additional props
  loading: <LoadingComponent />,
};
```
### exitTitle
The title on the dialog that appears when exiting out of Flatfile.
```jsx src/client.js
const flatfileOptions = {
publishableKey,
exitTitle: "",
//additional props
};
```
### exitText
The text on the dialog that appears when exiting out of Flatfile.
Default: "Are you sure you would like to close this window? This will end your
current data import session."
```jsx src/client.js
const flatfileOptions = {
publishableKey,
exitText: "",
//additional props
};
```
### exitPrimaryButtonText
The text on the dialog primary button that appears when exiting out of
Flatfile.
```jsx src/client.js
const flatfileOptions = {
publishableKey,
exitPrimaryButtonText: "",
//additional props
};
```
### exitSecondaryButtonText
The text on the dialog secondary button that appears when exiting out of
Flatfile.
```jsx src/client.js
const flatfileOptions = {
publishableKey,
exitSecondaryButtonText: "",
//additional props
};
```
### errorTitle
The title on the dialog that appears when an unexpected error occurs when
loading Flatfile. Note: the error dialog will also include error details
regarding the error that occurred.
```jsx src/client.js
const flatfileOptions = {
publishableKey,
errorTitle: "",
//additional props
};
```
### iframe styles
Theming within the Flatfile application is done in an event listener.
The CSS here is for styling how you want things to look outside of the Flatfile
iframe, like the exit dialog, the error container, and the dialog wrapper. This
CSS can go in your `style.css` in your `public` folder. Remember to use
`!important` to override values.
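For example, a couple of overrides using selectors documented in the stylesheet comments above (the values are placeholders, not recommended defaults):

```css
/* Illustrative overrides for elements outside the Flatfile iframe.
   Adjust the values to match your brand. */
.flatfile_outer-shell {
  background-color: rgba(0, 0, 0, 0.4) !important;
}
.flatfile_modal {
  border-radius: 8px !important;
}
```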
## On Premises
For customers hosting on-premises, the following variables apply.
### apiUrl
The endpoint used to interact with the Flatfile API
```jsx src/client.js
const flatfileOptions = {
publishableKey,
apiUrl: "",
//additional props
};
```
### spaceUrl
The URL for accessing the Flatfile Spaces API.
```jsx src/client.js
const flatfileOptions = {
publishableKey,
spaceUrl: "",
//additional props
};
```
# Common
Source: https://flatfile.com/docs/documentation/sdks/reference/common
commonly used embedded properties
All wrapper SDKs take the following shared properties. It's recommended to start
with the defaults in the Guides, then come back here to make customizations.
## Authenticate
### publishableKey
Publishable key accessed via Flatfile dashboard > Developer settings.
```typescript src/client.js
const flatfileOptions = {
publishableKey,
//additional props
};
```
## Identify
While setting up embedded Flatfile, you have the option to either establish a
fresh Space during each initialization or reuse an existing Space. These
properties specifically apply when creating a new Space.
### userInfo
Additional metadata to be passed to the space as it's created
```typescript src/client.js
const flatfileOptions = {
publishableKey,
userInfo: {
userId: "string",
name: "string",
companyId: "string",
companyName: "string",
},
};
```
### externalActorId
An optional unique identifier that enables embedded users to be associated with
a specific actor in Flatfile, so that repeat visits through the embedded
experience are attributed to the same actor.
```typescript src/client.js
const flatfileOptions = {
publishableKey,
config: {
externalActorId: "test-1",
},
};
```
## Look & Feel
### name
Name of the space
```typescript src/client.js
const flatfileOptions = {
name: "MySpace",
//additional props
};
```
### themeConfig
Theme values for the Space, sidebar and data table.
Theme your Space to create a custom look and feel to match your brand using the
actual CSS variables referenced in the app. See all options in our
[Theming Reference](/learning-center/guides/theming).
```typescript src/client.js
const theme = {
root: {
primaryColor: "red",
},
// see theming reference
};
const flatfileOptions = {
publishableKey,
themeConfig: theme,
//additional props
};
```
### spaceBody
Pass in space options to configure a new Space. Find all available parameters
for spaceBody in the [API
Reference](https://reference.flatfile.com/api-reference/spaces/create).
```typescript src/client.js
const flatfileOptions = {
publishableKey,
//additional props, see Create Space endpoint for all the full list of properties (https://reference.flatfile.com/api-reference/spaces/create)
spaceBody: {
name: "New Space",
namespace: "Red",
},
};
```
### sidebarConfig
Sidebar config values to toggle UI elements
Within the sidebar, you can set the default page, hide or show the sidebar, and
hide or show the data checklist.
If multiple values are provided for defaultPage, it will prioritize in the
following order: Sheet, Workbook, Document, Checklist.
```typescript sidebarConfig
const mySidebarConfig = {
showSidebar: false,
showDataChecklist: false,
defaultPage: {
workbook: {
workbookId: "123",
sheetId: "123",
},
},
};
const flatfileOptions = {
sidebarConfig: mySidebarConfig,
publishableKey,
workbook,
};
```
| Property | Type | Description |
| ------------------- | ---------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `defaultPage` | `{ documentId?: string, workbook?: { workbookId: string, sheetId?: string }, checklist?: boolean }` | Landing page upon loading a Space. Defaults to primary Workbook. |
| `showDataChecklist` | `boolean` | Toggle data config for Space. Defaults to false. |
| `showSidebar` | `boolean` | Determines if a guest can see the sidebar. If the sidebar is hidden and there are multiple Workbooks in your Space, please specify the desired Workbook ID as your defaultPage. |
### document
Document to pass to space
[Documents](/learning-center/guides/documents), written in Markdown, provide extra
clarification and instructions to users of a Space. Each Document will appear in
the sidebar of a Space. Standard syntax is supported.
```typescript src/client.js
const mainDocument = {
title: "Getting Started",
body:
"\n" +
"\\\n" +
" \n" +
"\n" +
"---\n" +
"\n" +
"# Welcome to the Surf Shop!\n" +
"\n" +
"Please upload your contacts to the Surf Shop using the Files menu on the left.\n",
};
const flatfileOptions = {
publishableKey,
document: mainDocument,
//additional props
};
```
## Client-Side Server
These options are available when configuring your Space client-side.
### environmentId
Identifier for environment. This is optional when creating a Workbook
client-side. The key includes environment-specific configurations.
```typescript src/client.js
const flatfileOptions = {
publishableKey,
environmentId,
//additional props
};
```
### workbook
Config of the workbook to be created
```typescript src/client.js
const myWorkbook = {
// see Blueprint
};
const flatfileOptions = {
publishableKey,
workbook: myWorkbook,
//additional props
};
```
### listener
Event listener for responding to Events.
```typescript src/client.js
const myListener = {
// listener code
};
const flatfileOptions = {
publishableKey,
listener: myListener,
//additional props
};
```
## Configuration
These properties allow you to configure how the importer is mounted, displayed,
and closed.
### closeSpace
Options for when to close the iframe and a callback function when the session
is complete.
```typescript src/client.js
const flatfileOptions = {
publishableKey,
//additional props
closeSpace: {
operation: "submit",
onClose: () => {},
},
};
```
This `onClose` callback function is called when the 'operation' defined in the
Action is complete and closes the iframe. It can also be used to perform cleanup
actions, such as resetting the state of the parent application or updating the
UI.
### displayAsModal
Display Flatfile as a full screen modal or inline on the page.
Toggling off this property will update the wrapper CSS to make the iframe fully
inline, thereby hiding the close button. The child divs will no longer include
the `.flatfile_displayAsModal` class, which removes all of the dialog styling.
The default width and height of the container can be overridden in `style.css`.
```typescript src/client.js
const flatfileOptions = {
publishableKey,
displayAsModal: false,
//additional props
};
```
```css public/styles.css
.flatfile_iframe-wrapper {
min-width: 768px;
min-height: 600px;
width: 992px;
height: 600px;
}
```
# Simple
Source: https://flatfile.com/docs/documentation/sdks/reference/simple
simple-mode embedded properties
### publishableKey
Publishable key accessed via Flatfile dashboard > Developer settings.
```typescript src/client.js
const flatfileOptions = {
publishableKey,
//additional props
};
```
### spaceBody
Pass in space options to configure a new Space. Find all available parameters
for spaceBody in the [API
Reference](https://reference.flatfile.com/api-reference/spaces/create).
```typescript src/client.js
const flatfileOptions = {
publishableKey,
//additional props, see Create Space endpoint for all the full list of properties (https://reference.flatfile.com/api-reference/spaces/create)
spaceBody: {
name: "New Space",
namespace: "Red",
},
};
```
### themeConfig
Theme values for the Space, sidebar and data table.
Theme your Space to create a custom look and feel to match your brand using the
actual CSS variables referenced in the app. See all options in our
[Theming Reference](/learning-center/guides/theming).
```typescript src/client.js
const theme = {
root: {
primaryColor: "red",
},
// see theming reference
};
```
## sheet\[]
Configuring your sheet for Portal tells us about the data you’re looking to
receive and in what format. You can specify things like fields being required or
unique so that some of the built-in validation can help expedite your data
import.
A sheet contains three elements: a `name`, a `slug`, and your `fields`. We go
over Fields in detail in our Blueprint guide, but in the example below, each
field has a `key`, `type`, and `label`. Some fields also use the `constraints`
property to specify things like a field being `required` to have a value or
needing to be `unique` among the field's values. You might also notice that the
field of type `enum` uses an additional `config` property where the `options` in
the enum list are defined.
Example sheet:
```js index.html
const sheet = {
name: 'Contacts',
slug: 'contacts',
fields: [
{
key: 'name',
type: 'string',
label: 'Name',
constraints: [
{ type: 'required' }
]
},
{
key: 'email',
type: 'string',
label: 'Email',
constraints: [
{ type: 'required' },
{ type: 'unique' },
]
},
{
key: 'age',
type: 'number',
label: 'Age',
},
{
key: 'employmentStatus',
type: 'boolean',
label: 'Currently employed?',
},
{
key: 'birthYear',
type: 'enum',
label: 'Birth Year',
config: {
options: [
{ value: '1999', label: '1999' },
{ value: '2000', label: '2000' },
// more here
]
}
}
],
}
```
Now that we’ve built out the sheet above, we can add it to the full example
below. Notice that in addition to the above sheet being added into the script
tag, we’ve also referenced the `sheet` in the `flatfileOptions`. Quick note: the
`flatfileOptions` object expects a property named `sheet`, so you can name your
sheet variable whatever you like, but if your sheet is `const mySheet = {}` then
your `flatfileOptions` object will need to look like `sheet: mySheet` instead.
Adding on to the Full Example:
```html index.html
Your Page Title
We are going to import your data now. Click the button below to start your import
```
## onRecordHook()
Now that we have configured our Sheet, we will look at the Data Validations
piece. Data Validations in the Portal setup can be written inside the
`onRecordHook` callback of the `flatfileOptions` object. This callback runs on
each record at initialization and then on any subsequent change to the data.
The callback receives the `record` by default; apply any updates you’d like and
return the record as the output of the callback, and the record will be updated
in the UI. To keep this example basic, let’s say you like to store the `name`
property in your database as all-uppercase values. To ensure this happens, we
can write a simple function using JavaScript’s `toUpperCase` method so that all
values are continually uppercased.
```js index.html
// taking just the flatfileOptions object out from the full example
const flatfileOptions = {
publishableKey,
sheet,
onRecordHook: (record) => {
const name = record.get('name')
if (name) {
record.set('name', name.toUpperCase())
record.addInfo('name', 'We updated the values to all uppercase')
}
return record
},
}
```
What you’ll notice from the above is that we can use the built-in `record.get()`
method with the field’s key to get the record’s initial value. We can then use
the `record.set()` method to set a new value for the field, and we can
optionally provide some context on the record with `record.addInfo()` to set a
message on the field.
Quick note on the additional info being added: you can add additional info on a
field at several different validation levels. You can use `addInfo` as a
courtesy to let your end user know what transformations are happening with the
data; you can use `addWarning` to provide a warning message (the main
difference between the two is how noticeable a field is after running the hook
in the UI); and finally, you can use `addError`, which not only highlights
everything in red but also changes the data’s validation state to invalid,
requiring the end user to fix the issue.
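To make the three levels concrete, here is a self-contained sketch: the hook logic mirrors the `record` methods shown above, and the `mockRecord` helper is invented only so the example can run outside Flatfile.

```javascript
// A record hook illustrating the message levels. The record interface
// (get/addInfo/addWarning/addError) follows the Portal API above.
function emailHook(record) {
  const email = record.get("email");
  if (email && !email.includes("@")) {
    // addError flags the field invalid; the user must fix it before submitting
    record.addError("email", "Please enter a valid email address");
  } else if (email) {
    // addInfo is a courtesy note; addWarning would be more prominent in the UI
    record.addInfo("email", "Email looks well-formed");
  }
  return record;
}

// Minimal mock of the record interface, only for trying the hook locally
function mockRecord(values) {
  const messages = [];
  return {
    get: (k) => values[k],
    set: (k, v) => { values[k] = v; },
    addInfo: (k, m) => messages.push({ level: "info", k, m }),
    addWarning: (k, m) => messages.push({ level: "warn", k, m }),
    addError: (k, m) => messages.push({ level: "error", k, m }),
    messages,
  };
}

const rec = mockRecord({ email: "not-an-email" });
emailHook(rec);
console.log(rec.messages[0].level); // → 'error'
```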
Here’s the above added to the full code snippet:
```html index.html
Your Page Title
We are going to import your data now. Click the button below to start your import
```
## onSubmit()
Flatfile has many different ways to handle and process data back to you once an
import has happened, but the easiest in this context is the built-in `onSubmit`
callback where the data is provided to you. Using this callback will add a
button labelled "Submit" in the top right of your space for users to click on to
initiate the record submission process. Due to the nature of this callback and
button, it cannot be used with our
[action constraints](/learning-center/concepts/actions#optional-parameters).
This `onSubmit` callback passes in an object that has the records for you to
send wherever you might need to store them. For example purposes, we are going
to simply log the results, but this is where you could send results to your
server or pass them back to your application.
When the data is finished and available, the `onSubmit` method will provide you
with a `sheet` that you can use the built-in methods like `allData()`,
`validData()`, `errorData()`, `inChunks()` and `stream()` to get your data.
Here’s a quick reference table for the available methods and what they do.
| Method | Example Usage | Return description | Additional notes |
| ----------------------------------- | ------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------ |
| `allData()` | `sheet.allData()` | This returns the data that is being imported. This uses the [GET Records](https://reference.flatfile.com/api-reference/records/get) endpoint under the hood, which has a default pageSize of 10k records. The `pageSize` and `pageNumber` parameters can be used with this method to paginate through your data or increase the records returned. | All of the data does come with a way of determining whether the individual record was valid or not, but it returns all records regardless of validity. |
| `validData()`                       | `sheet.validData()`                                                             | This returns all records that are flagged as valid and leaves off all invalid records. Like `sheet.allData()`, this has a default pageSize of 10k records.                                                                                                                                                                                        |                                                                                                                                                        |
| `errorData()` | `sheet.errorData()` | This will return only those records that are considered invalid and leaves off any valid records. Like `sheet.allData()`, this has a default pageSize of 10k records. | |
| `inChunks(cb, {chunkSize: number})` | `sheet.inChunks((data) => // do something with each chunk, { chunkSize: 100 })` | The inChunks method will run the callback function provided on each chunk of data until all chunks have been processed. | The default chunk size is 1000 |
| `stream(cb)` | `sheet.stream((data) => // do something to each chunk of the stream)` | The stream method will run the callback on every 1000 records available until all records have been processed | |
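As a conceptual model of the chunking behavior (a sketch, not the SDK's actual implementation), `inChunks` amounts to slicing the records into `chunkSize` batches and awaiting the callback once per batch:

```javascript
// Conceptual model of inChunks: batch the records and await the callback
// once per batch. This is a sketch, not the SDK internals.
async function inChunksSketch(records, cb, { chunkSize = 1000 } = {}) {
  for (let i = 0; i < records.length; i += chunkSize) {
    await cb(records.slice(i, i + chunkSize));
  }
}

// 2500 fake records in batches of 1000 -> the callback runs 3 times
const fakeRecords = Array.from({ length: 2500 }, (_, i) => ({ id: i }));
let batches = 0;
inChunksSketch(fakeRecords, async () => { batches += 1; })
  .then(() => console.log(batches)); // → 3
```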
With all those methods brought to light, let’s simply use the `allData` method
to get all the records at once and then console log them in the process.
```js index.html
// just working within the confines of the flatfileOptions object
const flatfileOptions = {
publishableKey,
sheet,
onRecordHook: (record) => {
const name = record.get('name')
if (name) {
record.set('name', name.toUpperCase())
record.addInfo('name', 'We updated the values to all uppercase')
}
return record
},
onSubmit: async ({sheet}) => {
const data = await sheet.allData()
console.log(data)
}
}
```
Now let’s put it all together in the full example:
```html index.html
Your Page Title
We are going to import your data now. Click the button below to start your
import
```
The above example is a fully working example that only needs to have your
publishable key changed out for it to work as expected.
As a quick reminder, this is just the baseline of what Flatfile is capable of
doing. You can check out more resources in our
[documentation here.](https://flatfile.com/docs/overview)
# Quickstart
Source: https://flatfile.com/docs/documentation/sdks/vue/new_space
create a new Space every time Flatfile is opened
For synchronous data import/exchange completed in one session, create a new
Space each time Flatfile is opened. This suits situations where a clean slate
for every interaction is preferred.
## Before you begin
Make a new directory.
```bash
mkdir example-flatfile-vuejs-embed
```
Go into that directory.
```bash
cd example-flatfile-vuejs-embed
```
Follow prompts from the `init` command.
```bash
npm init
```
```bash
npm i vue @flatfile/vue @flatfile/plugin-record-hook @flatfile/listener && npm i --save-dev vite @vitejs/plugin-vue vue-tsc typescript
```
### Create your file structure
Setup your app to look something like this:
```
├── src/
├── App.vue
├── config.ts
├── main.ts
├── shims-vue.d.ts (optional if you have errors when importing App.vue)
├── vite-env.d.ts
├── styles.css (optional)
└── listener.ts (wait to add this)
├── index.html
├── tsconfig.json
├── tsconfig.node.json
├── vite.config.ts
├── .env (for your keys)
├── package.json <--- already created
└── package-lock.json <--- already created
```
In this file structure, your main directory is `src`.
Depending on how you want to handle styling your application, you can have a
`styles.css` file in your `src` directory, create sub-directories to contain
style files, or even create directories outside of `src` if you prefer. The only
requirement is that the files can be imported into Vue files.
The heart of your Vue app will be `App.vue`, as this will configure and render
your component(s) to the `index.html`.
### Build your importer
#### 1. Add a Flatfile Button
Add a button to your application to open Flatfile in a modal. Pass in your
`publishableKey` and a new Space will be created on each page load. Optionally,
add the content here to your `styles.css`.
```HTML index.html (full page)
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <title>Vue</title>
  </head>
  <body>
    <div id="app"></div>
    <script type="module" src="/src/main.ts"></script>
  </body>
</html>
```
```CSS src/styles.css
/* Styles for home page */
html,
body {
height: 100%;
margin: 0;
padding: 0;
font-family: sans-serif;
background: #090b2b;
color: #fff;
}
#app {
display: flex;
align-items: center;
justify-content: center;
height: 100%;
}
/* End of styles for home page */
:root {
--ff-primary-color: #4c48ef !important;
--ff-secondary-color: #616a7d !important;
--ff-text-color: #090b2b !important;
--ff-dialog-border-radius: 4px !important;
--ff-border-radius: 5px !important;
--ff-bg-fade: rgba(0, 0, 0, 0.2) !important;
}
/* The default mount element */
/* #flatfile_iFrameContainer {
} */
/* A div around the iframe that contains Flatfile */
/* .flatfile_iframe-wrapper {
} */
/* The actual iframe that contains Flatfile */
/* #flatfile_iframe {
} */
.flatfile-close-button {
display: none !important;
}
/* Begin style overrides for when Flatfile is displayed as a modal */
/* This class gets appended to the flatfile_iframe-wrapper div */
.flatfile_displayAsModal {
padding: 50px !important;
width: calc(100% - 100px) !important;
height: calc(100vh - 100px) !important;
}
.flatfile_iframe-wrapper.flatfile_displayAsModal {
background: var(--ff-bg-fade) !important;
}
/* The close button in top right to close modal */
.flatfile_displayAsModal .flatfile-close-button {
display: block !important;
margin: 20px !important;
}
/* The icon for the close button in top right to close modal */
.flatfile_displayAsModal .flatfile-close-button svg {
fill: var(--ff-secondary-color) !important;
}
/* The actual iframe that contains Flatfile */
.flatfile_displayAsModal #flatfile_iframe {
border-radius: var(--ff-border-radius);
}
/* Begin style overrides for when you cancel out of the Flatfile modal */
/* The outer container of the modal that opens when you cancel out of Flatfile */
.flatfile_outer-shell {
background-color: var(--ff-bg-fade) !important;
border-radius: var(--ff-border-radius) !important;
}
/* The inner container of the modal that opens when you cancel out of Flatfile */
/* .flatfile_inner-shell {
} */
/* The white box inside the modal that opens when you cancel out of Flatfile */
.flatfile_modal {
border-radius: var(--ff-dialog-border-radius) !important;
}
/* The container for the buttons you see in the close modal */
/* .flatfile_button-group {
} */
/* Style the buttons you see in the close modal */
/* .flatfile_button {
} */
/* The "yes, cancel" button you see in the close modal */
.flatfile_primary {
border: 1px solid var(--ff-primary-color) !important;
background-color: var(--ff-primary-color) !important;
color: #fff;
}
/* The "no, stay" button you see in the close modal */
.flatfile_secondary {
color: var(--ff-secondary-color) !important;
}
/* The heading text you see in the close modal */
.flatfile_modal-heading {
color: var(--ff-text-color) !important;
}
/* The description text you see in the close modal */
.flatfile_modal-text {
color: var(--ff-secondary-color) !important;
}
/* End style overrides for when you cancel out of the Flatfile modal */
/* End style overrides for when Flatfile is displayed as a modal */
/* The container of the error component */
/* .ff_error_container {
}*/
/* The heading text you see in the error component */
/* .ff_error_heading {
}*/
/* The description text you see in the error component */
/* .ff_error_text {
}*/
```
### 2. Configure your project
In your environment file, you'll need to define your keys. In this example, you
will need your `publishableKey`. You can find these under "Developer settings"
when logged into [platform.flatfile.com](https://platform.flatfile.com).
You can create your `.env` file via the template below, as well as your
`App.vue`, `tsconfig.json`, `tsconfig.node.json`, and `vite.config.ts`, which
are the remaining top-level config files. After these are initialized properly
we can configure the app files under `src`.
```bash .env
# VITE_ is in front of these variables to make them accessible to App.vue
VITE_ENVIRONMENT_ID = "your_environment_id"
VITE_PUBLISHABLE_KEY = "your_publishable_key"
```
```JSON tsconfig.json
{
"compilerOptions": {
"baseUrl": ".",
"module": "ESNext",
"target": "es2016",
"useDefineForClassFields": true,
"lib": ["ES2020", "ESNext", "DOM", "DOM.Iterable"],
"skipLibCheck": true,
"allowJs": true,
"strict": true,
"esModuleInterop": true,
"incremental": true,
"moduleResolution": "node",
"resolveJsonModule": true,
"forceConsistentCasingInFileNames": true,
},
"include": ["src/**/*.ts", "src/**/*.d.ts", "src/**/*.tsx", "src/**/*.vue"],
"references": [{ "path": "./tsconfig.node.json" }]
}
```
```JSON tsconfig.node.json
{
"compilerOptions": {
"composite": true,
"skipLibCheck": true,
"module": "es2015",
"moduleResolution": "bundler",
"allowSyntheticDefaultImports": true
},
"include": ["vite.config.ts"]
}
```
```TypeScript vite.config.ts
import { defineConfig } from "vite";
import vue from "@vitejs/plugin-vue";
// https://vitejs.dev/config/
export default defineConfig({
plugins: [vue()],
});
```
### 3. Initialize Flatfile
In your `App.vue` you'll need to pass in a minimum of the `publishableKey` from
your `.env`.
Also, add the content here to your `main.ts` and `vite-env.d.ts`. Optionally,
you can add `shims-vue.d.ts` if you need it.
```HTML src/App.vue
```
```TypeScript src/main.ts
import { createApp } from "vue";
import App from "./App.vue";
createApp(App).mount("#app");
```
```TypeScript src/vite-env.d.ts
/// <reference types="vite/client" />
```
```TypeScript src/shims-vue.d.ts
declare module "*.vue";
```
### 4. Start your client
1. Update your package.json to include this script:
```JSON
"scripts": {
"dev": "vite",
"build": "vue-tsc && vite build",
"preview": "vite preview"
},
```
2. Now, start your front end by heading to the terminal and running the
following command.
```bash
npm run dev
```
Your console should display a message like the following:
```
> vue@0.0.0 dev
> vite
VITE v4.5.0 ready in 215 ms
➜ Local: http://localhost:5173/
➜ Network: use --host to expose
➜ press h to show help
```
3. To view your app, copy the url and paste it into your browser. Click the
button and see that an Empty Space gets created.
### 5. Build a workbook
Now, let’s build a Workbook inside the Space for next time.
Add your `config.ts` file, and place the following code in it. After you have
done so, import the configuration to `App.vue`, and update `spaceProps` to have
the value of `config` under `workbook` (this will be around `line 36`). This
config file has the configuration settings for your workbook, so feel free to
edit however necessary to meet your needs.
```TypeScript src/config.ts
import { Flatfile } from "@flatfile/api";
export const config: Pick<
Flatfile.CreateWorkbookConfig,
"name" | "sheets" | "actions"
> = {
name: "Employees workbook",
sheets: [
{
name: "TestSheet",
slug: "TestSheet",
fields: [
{
key: "first_name",
type: "string",
label: "First name",
constraints: [
{
type: "required",
},
],
},
{
key: "last_name",
type: "string",
label: "Last name",
},
{
key: "email",
type: "string",
label: "Email",
},
],
actions: [
{
label: "Join fields",
operation: "TestSheet:join-fields",
description: "Would you like to join fields?",
mode: "foreground",
confirm: true,
},
],
},
],
actions: [
{
label: "Submit",
operation: "TestSheet:submit",
description: "Would you like to submit your workbook?",
mode: "foreground",
primary: true,
confirm: true,
},
],
};
```
### 6. Transform Data
Next, we'll listen for data changes and respond using an event listener.
1. Add a `src/listeners/listener.ts` file with this simple `recordHook`.
2. Update `App.vue` to import the listener.
Once you add this code, when a change occurs, we'll log the entered first name
and update the last name to "Rock." You'll immediately see this begin to work
when you add or update any records. Learn more about
[Handling Data](/learning-center/guides/handling-data)
```TypeScript src/listeners/listener.ts
import api from "@flatfile/api";
import { FlatfileListener } from "@flatfile/listener";
import { recordHook } from "@flatfile/plugin-record-hook";
/**
* Example Listener
*/
export const listener = FlatfileListener.create((listener) => {
listener.on("**", (event) => {
console.log(`Received event:`, event);
});
listener.use(
recordHook("TestSheet", (record) => {
const firstName = record.get("first_name");
console.log({ firstName });
record.set("last_name", "Rock");
return record;
})
);
listener.filter({ job: "workbook:TestSheet:submit" }, (configure) => {
configure.on("job:ready", async ({ context: { jobId } }) => {
try {
await api.jobs.ack(jobId, {
info: "Getting started.",
progress: 10,
});
// Make changes after cells in a Sheet have been updated
console.log("Make changes here when an action is clicked");
await api.jobs.complete(jobId, {
outcome: {
acknowledge: true,
message: "This is now complete.",
next: {
type: "wait",
},
},
});
} catch (error: any) {
console.error("Error:", error.stack);
await api.jobs.fail(jobId, {
outcome: {
message: "This job encountered an error.",
},
});
}
});
});
});
```
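To see what the hook does in isolation, here is that transform applied to a tiny stand-in for the record object, using the sheet's `first_name`/`last_name` field keys (a real `recordHook` record has more methods; this is only a sketch):

```typescript
// Tiny stand-in for a Flatfile record: get/set over a plain value map.
type Values = Record<string, unknown>;

const makeRecord = (values: Values) => ({
  get: (key: string) => values[key],
  set: (key: string, value: unknown) => {
    values[key] = value;
  },
});

// The same transform the recordHook runs for every record in TestSheet.
function applyHook(record: ReturnType<typeof makeRecord>) {
  const firstName = record.get("first_name");
  console.log({ firstName });
  record.set("last_name", "Rock");
  return record;
}

const record = applyHook(makeRecord({ first_name: "Dwayne", last_name: "Johnson" }));
// record.get("last_name") is now "Rock"
```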
```JavaScript src/App.vue
```
### 7. Match your brand
By attaching a `themeConfig` to `spaceProps` in `src/App.vue`, we will now
override colors in your Space to match your brand. See all of the options here
in the [Theming Reference](/learning-center/guides/theming).
```typescript
themeConfig: {
root: {
primaryColor: "red",
textColor: "white",
logo: "https://images.ctfassets.net/hjneo4qi4goj/gL6Blz3kTPdZXWknuIDVx/7bb7c73d93b111ed542d2ed426b42fd5/flatfile.svg",
},
},
```
### 8. Add customizations
You can stop here or you can [view our full reference](../reference/common) to
see all the ways you can customize your importer.
## Example Project
Find this Vue.js example project in the Flatfile GitHub repository.
Clone the Flatfile Vue.js tutorial here.
# Reusable Spaces
Source: https://flatfile.com/docs/documentation/sdks/vue/reuse_space
reuse a Space when Flatfile is opened
For applications meant to use the same space consistently, open an existing
space each time Flatfile is opened. This suits situations where consistently
editing a dataset is preferred.
## Before you begin
If you have already tried our [Embed a New Space](./new_space) guide, note that
this guide departs from it significantly, so you will want to build this in a
new directory; adapting the earlier project would be harder than starting from
scratch.
Make a new Directory.
```bash
mkdir example-flatfile-vuejs-embed
```
Go into that directory.
```bash
cd example-flatfile-vuejs-embed
```
Follow prompts from the `init` command.
```bash
npm init
```
Install Packages
```bash
npm i @flatfile/api @flatfile/listener @flatfile/plugin-record-hook @flatfile/vue dotenv ejs express vue && npm i --save-dev @types/express @types/node @vitejs/plugin-vue concurrently nodemon ts-node typescript vite vue-tsc
```
### Create your file structure
The file structure of this application is fairly complex, as it requires a
dedicated server and API. If you'd like to keep your own application simpler,
you can create the API endpoint externally; this example, however, is an
all-in-one application, server, and API solution.
In order to create this solution, create a file structure like the following:
```text
├── server/
├── assetsRouter.ts
├── homepageRouter.ts
└── index.ts
├── src/
├── components/
├── ExistingSpace.vue
├── Home.vue
└── NewSpace.vue <--- optional, requires listener.ts and config.ts
├── listeners/
└── listener.ts <--- optional depending on if NewSpace.vue is included
├── styles/
└── styles.css
├── workbooks/
└── config.ts <--- optional depending on if NewSpace.vue is included
├── App.vue
├── main.ts
├── shims-vue.d.ts <--- may be optional
└── vite-env.d.ts
├── views/
└── index.html.ejs
├── .env
├── .gitignore
├── nodemon.json
├── package.json <--- already created
├── package-lock.json <--- already created
├── tsconfig.json
└── vite.config.js
```
In this file structure, you will work primarily out of `/src`, especially
`/src/components`. Many of these files are configuration files that are mostly
set once and not touched again, so while there may be a lot of them, they won't
be actively managed.
### Update your .env
Update your `.env`. `FLATFILE_API_KEY` is your Secret Key and `SPACE_ID` is the
Space you want to open in the importer. This can be found on your Dashboard
where it lists your Spaces. You may also want to include your `PUBLISHABLE_KEY`
and `ENVIRONMENT_ID` if you want to have your app create new spaces as well.
Note that in the example below some of the variables are prefixed with `VITE_`.
Vite requires this prefix to make them accessible at runtime. The
`FLATFILE_API_KEY` should never be accessible from the browser for security
reasons, and so it does not have this prefix.
```bash
# Required for creating new spaces
VITE_PUBLISHABLE_KEY = "pk_12345"
VITE_ENVIRONMENT_ID = "us_env_12345"
# Required for embedding existing spaces
VITE_SPACE_ID = "us_sp_12345"
FLATFILE_API_KEY = "sk_12345"
```
### Build your importer
Before starting to build the application, note that it has a few moving parts
and won't start up cleanly between each individual update. This guide breaks the
files into groups so that, after everything is updated as directed, the app
should start up without errors.
#### 1. Set up Configuration Files
This app has several configuration files that must be set up before you can get
started with development. Set them up as shown below.
```JavaScript vite.config.js
import vue from '@vitejs/plugin-vue'
import { defineConfig } from 'vite'
// https://vitejs.dev/config/
export default defineConfig({
plugins: [vue()],
build: {
manifest: true,
rollupOptions: {
input: './src/main.ts',
},
},
})
```
```JSON tsconfig.json
{
"compilerOptions": {
"baseUrl": ".",
"outDir": "dist",
"module": "es6",
"target": "es2016",
"lib": [
"dom",
"dom.iterable",
"esnext",
"es2020"
],
"useDefineForClassFields": true,
"skipLibCheck": true,
"allowJs": true,
"strict": true,
"esModuleInterop": true,
"incremental": true,
"moduleResolution": "node",
"resolveJsonModule": true,
"forceConsistentCasingInFileNames": true,
"noEmit": true,
"isolatedModules": true,
"jsx": "preserve",
},
"ts-node": {
"esm": true,
"experimentalSpecifierResolution": "node"
},
"include": [
"**/*.ts",
"**/*.d.ts",
"**/*.tsx",
"src/**/*.vue"
],
}
```
```JSON nodemon.json
{
"watch": ["server"],
"ext": "ts,json",
"ignore": ["src/**/*.spec.ts"],
"exec": "ts-node ./server/index.ts"
}
```
```gitignore .gitignore
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
pnpm-debug.log*
lerna-debug.log*
node_modules
dist
dist-ssr
*.local
# Editor directories and files
.vscode/*
!.vscode/extensions.json
.idea
.DS_Store
*.suo
*.ntvs*
*.njsproj
*.sln
*.sw?
.env
```
You will also want to add some scripts to your `package.json` to start the app.
Add the following scripts:
```JSON package.json (snippet)
"scripts": {
"dev:frontend": "vite",
"dev:backend": "nodemon",
"dev": "concurrently 'npx tsc --watch' 'npm:dev:frontend' 'npm:dev:backend'",
"start": "NODE_ENV=production ts-node server/index.ts",
"build": "vite build"
},
```
Please note that these scripts include files that haven't been created yet.
These will be created in the following steps.
#### 2. Create the EJS HTML and Server
This app has a home page that routes the user either to create a new space or
to embed an existing space. Creating a new space is optional and not the focus
of this guide (though instructions for it are included below); the files edited
here set up the functionality for embedding existing spaces, and so are still
necessary.
This application utilizes EJS, which helps build dynamic HTML pages. It lets us
produce both development and production environment outputs from one file. Set
this up at `/views/index.html.ejs`.
This file will not work until the server is functioning, but it is required for
the rest of the app to work.
```EJS views/index.html.ejs
Vite Vue
<% if (environment === 'production') { %>
<% } %>
<% if (environment === 'production') { %>
<% } else { %>
<% } %>
```
Next, let's create the server that will act as the backend of the application.
It is needed both to serve the pages and to fetch the existing space, since for
security reasons the Secret Key can never be exposed to the browser.
```TypeScript server/index.ts
import dotenv from 'dotenv';
import express from "express";
import path from "path";
import { FlatfileClient } from "@flatfile/api";
dotenv.config();
const port = process.env.PORT || 3000;
const publicPath = path.join(path.resolve(), "public"); //If your project doesn't require a public path this may not be necessary
const distPath = path.join(path.resolve(), "dist");
const app = express();
app.get('/api/spaces/:id', async (_req, res)=>{
const {id} = _req.params;
const flatfile = new FlatfileClient({
token: process.env.FLATFILE_API_KEY,
environment: 'https://platform.flatfile.com/v1/',
});
try {
const space = await flatfile.spaces.get(id);
res.json({ space });
} catch (error) {
console.error("Error retrieving space:", error);
res.status(500).json({ error: "Failed to retrieve space" });
}
})
app.listen(port, () => {
console.log("Server listening on port", port);
});
```
In addition to the `server/index.ts`, you will need to create two routers - One
for the homepage and one for the assets. This is because the frontend runs on a
different port than you'll be accessing the app from, so the app will need a bit
of help resolving pathing.
Create `server/homepageRouter.ts` and `server/assetsRouter.ts` as follows, then
update `server/index.ts`.
```TypeScript server/homepageRouter.ts
import express from "express";
import fs from 'fs/promises';
import path from "path";
const router = express.Router();
const environment = process.env.NODE_ENV;
router.get('/*', async (_req, res)=>{
const data = {
environment,
manifest: await parseManifest()
};
res.render('index.html.ejs', data);
});
const parseManifest = async () => {
if(environment !== "production") return {};
const manifestPath = path.join(path.resolve(), 'dist', "manifest.json");
const manifestFile = await fs.readFile(manifestPath, 'utf8');
return JSON.parse(manifestFile);
}
export default router;
```
```TypeScript server/assetsRouter.ts
import express from "express";
const router = express.Router();
const supportedAssets = ["svg", "png", "jpg", "jpeg", "mp4", "ogv"];
const assetExtensionRegex = () => {
const formattedExtensionList = supportedAssets.join("|");
return new RegExp(`/.+\\.(${formattedExtensionList})$`);
};
router.get(assetExtensionRegex(), (req, res) => {
res.redirect(303, `http://localhost:5173/src${req.path}`);
});
export default router;
```
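To sanity-check the matcher, the same regex construction can be exercised on its own (a standalone sketch mirroring the `assetExtensionRegex` helper above):

```typescript
// Build the same extension-matching regex used by the assets router.
const supportedAssets = ["svg", "png", "jpg", "jpeg", "mp4", "ogv"];
const assetExtensionRegex = new RegExp(
  `/.+\\.(${supportedAssets.join("|")})$`
);

console.log(assetExtensionRegex.test("/logo.svg"));    // true: proxied to Vite
console.log(assetExtensionRegex.test("/main.ts"));     // false: not an asset
console.log(assetExtensionRegex.test("/img/bg.jpeg")); // true
```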
```TypeScript server/index.ts
import dotenv from 'dotenv';
import express from "express";
import path from "path";
import assetsRouter from "./assetsRouter";
import homepageRouter from './homepageRouter';
import { FlatfileClient } from "@flatfile/api";
dotenv.config();
const port = process.env.PORT || 3000;
const publicPath = path.join(path.resolve(), "public");
const distPath = path.join(path.resolve(), "dist");
const app = express();
app.get('/api/spaces/:id', async (_req, res)=>{
const {id} = _req.params;
const flatfile = new FlatfileClient({
token: process.env.FLATFILE_API_KEY,
environment: 'https://platform.flatfile.com/v1/',
});
try {
const space = await flatfile.spaces.get(id);
res.json({ space });
} catch (error) {
console.error("Error retrieving space:", error);
res.status(500).json({ error: "Failed to retrieve space" });
}
})
if (process.env.NODE_ENV === "production") {
app.use("/", express.static(distPath));
} else {
app.use("/", express.static(publicPath));
app.use("/src", assetsRouter);
}
app.use(homepageRouter);
app.listen(port, () => {
console.log("Server listening on port", port);
});
```
#### 3. Build the App Component
Before building the components, you'll need to make a couple TypeScript
Declaration files. This is so your IDE won't throw unnecessary errors at you
during development.
You'll need a `src/vite-env.d.ts` file and a `src/shims-vue.d.ts` file to sort
out the declarations. Create them as follows:
```TypeScript src/vite-env.d.ts
/// <reference types="vite/client" />
```
```TypeScript src/shims-vue.d.ts
declare module '*.vue';
```
Next you'll need a `src/main.ts` file to mount `src/App.vue` to your HTML.
Create it and your App Component as shown below.
```TSX src/App.vue
<Flatfile />
```
```TypeScript src/main.ts
import { createApp } from 'vue'
import App from './App.vue'
createApp(App).mount('#app')
```
#### 4. Build your Home & Existing Space Component
This app is built around a Home component that is loaded first and lets you
navigate the app. That component is fairly basic and should look like this:
```TypeScript src/components/Home.vue
```
Now you'll need to build the component that will get your space and return it to
the browser.
The component should end up looking something like this:
```TypeScript src/components/ExistingSpace.vue
```
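The heart of this component is a request against the `/api/spaces/:id` route created earlier. A minimal sketch of that logic (the fetch function is injected as a parameter here only so the sketch is easy to exercise outside the browser; names are illustrative):

```typescript
// Builds the backend URL the component calls for a given Space ID.
const spaceUrl = (id: string) => `/api/spaces/${id}`;

type FetchLike = (url: string) => Promise<{ json(): Promise<any> }>;

// Fetches the Space from the Express backend; pass the browser's fetch in real use.
async function getSpace(id: string, fetchFn: FetchLike) {
  const res = await fetchFn(spaceUrl(id));
  const { space } = await res.json();
  return space;
}
```

In `ExistingSpace.vue` you would call something like `getSpace(import.meta.env.VITE_SPACE_ID, fetch)` and hand the result to the Flatfile component.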
You'll need to update your `App.vue` to include the existing space component. It
should end up looking like the example below:
```TypeScript src/App.vue
<Flatfile />
Embed Flatfile in just a few lines of code.
```
It may be wise to set up your styles in `src/styles/styles.css`. These should
end up looking like the example below:
```CSS src/styles/styles.css
/* Styles for home page */
html,
body {
height: 100%;
margin: 0;
padding: 0;
font-family: sans-serif;
background: #090b2b;
color: #fff;
}
#app {
display: flex;
align-items: center;
justify-content: center;
height: 100%;
}
/* End of styles for home page */
:root {
--ff-primary-color: #4c48ef !important;
--ff-secondary-color: #616a7d !important;
--ff-text-color: #090b2b !important;
--ff-dialog-border-radius: 4px !important;
--ff-border-radius: 5px !important;
--ff-bg-fade: rgba(0, 0, 0, 0.2) !important;
}
nav {
}
nav a {
color: white;
margin: 1em;
}
.button-container a {
border: rgb(101, 201, 101) 1px solid;
border-radius: 15px;
padding: 0.5em;
background-color: rgb(101, 201, 101);
text-decoration: none;
transition: 0.25s;
}
.button-container a:hover {
border: rgb(56, 139, 56) 1px solid;
border-radius: 15px;
padding: 0.5em;
background-color: rgb(56, 139, 56);
text-decoration: none;
}
.new-space-button-container {
margin-top: 1em;
}
/* The default mount element */
/* #flatfile_iFrameContainer {
} */
/* A div around the iframe that contains Flatfile */
/* .flatfile_iframe-wrapper {
} */
/* The actual iframe that contains Flatfile */
/* #flatfile_iframe {
} */
.flatfile-close-button {
display: none !important;
}
/* Begin style overrides for when Flatfile is displayed as a modal */
/* This class gets appended to the flatfile_iframe-wrapper div */
.flatfile_displayAsModal {
padding: 50px !important;
width: calc(100% - 100px) !important;
height: calc(100vh - 100px) !important;
}
.flatfile_iframe-wrapper.flatfile_displayAsModal {
background: var(--ff-bg-fade) !important;
}
/* The close button in top right to close modal */
.flatfile_displayAsModal .flatfile-close-button {
display: block !important;
margin: 20px !important;
}
/* The icon for the close button in top right to close modal */
.flatfile_displayAsModal .flatfile-close-button svg {
fill: var(--ff-secondary-color) !important;
}
/* The actual iframe that contains Flatfile */
.flatfile_displayAsModal #flatfile_iframe {
border-radius: var(--ff-border-radius);
}
/* Begin style overrides for when you cancel out of the Flatfile modal */
/* The outer container of the modal that opens when you cancel out of Flatfile */
.flatfile_outer-shell {
background-color: var(--ff-bg-fade) !important;
border-radius: var(--ff-border-radius) !important;
}
/* The inner container of the modal that opens when you cancel out of Flatfile */
/* .flatfile_inner-shell {
} */
/* The white box inside the modal that opens when you cancel out of Flatfile */
.flatfile_modal {
border-radius: var(--ff-dialog-border-radius) !important;
}
/* The container for the buttons you see in the close modal */
/* .flatfile_button-group {
} */
/* Style the buttons you see in the close modal */
/* .flatfile_button {
} */
/* The "yes, cancel" button you see in the close modal */
.flatfile_primary {
border: 1px solid var(--ff-primary-color) !important;
background-color: var(--ff-primary-color) !important;
color: #fff;
}
/* The "no, stay" button you see in the close modal */
.flatfile_secondary {
color: var(--ff-secondary-color) !important;
}
/* The heading text you see in the close modal */
.flatfile_modal-heading {
color: var(--ff-text-color) !important;
}
/* The description text you see in the close modal */
.flatfile_modal-text {
color: var(--ff-secondary-color) !important;
}
/* End style overrides for when you cancel out of the Flatfile modal */
/* End style overrides for when Flatfile is displayed as a modal */
/* The container of the error component */
/* .ff_error_container {
}*/
/* The heading text you see in the error component */
/* .ff_error_heading {
}*/
/* The description text you see in the error component */
/* .ff_error_text {
}*/
```
Remember to add `import './styles/styles.css'` to your `src/App.vue` with the
other imports to apply the styles.
#### 5. Optional - Build your New Space Component
If you want a page for creating new spaces, you can build a component to do
that fairly easily. The Vue code looks quite similar to the Existing Space
component, just with some different configuration settings. See below for the
necessary changes.
```TypeScript src/components/NewSpace.vue
```
```TypeScript src/App.vue
```
#### 6. Start your client
Now you should be able to start your app. To load it in dev mode and ensure
everything works properly, run:
```bash
npm run dev
```
If you have any errors, now is your time to fix them, otherwise - you're ready
to deploy!
#### 7. Customize
You can stop here or you can [view our full reference](../reference/common) to
see all the ways you can customize your importer.
## Example Project
Find this Vue.js example project in the Flatfile GitHub repository.
Clone the full Flatfile Vue.js tutorial here.
# Overview
Source: https://flatfile.com/docs/learning-center/architecture/about
the anatomy of Flatfile
Our platform follows a hierarchical structure designed to provide secure, organized access to various resources and automation capabilities. This document outlines the core components and their relationships.
```mermaid
flowchart TD
%% Node definitions
App[App]
Env[Environment]
subgraph ENV[" "]
style ENV fill:none,stroke-width:2px
subgraph SPACE[" "]
style SPACE fill:none,stroke-width:2px
Space[Space]
Workbook[Workbook]
Document[Document]
File[File]
Sheet[Sheet]
end
Secret[Secret]
subgraph AGENT[" "]
style AGENT fill:none,stroke-width:2px
Agent[Agent]
Job[Job]
end
end
%% Relationships
App <--> Env
Env --> Space
Env --> Agent
Env --> Secret
Space --> Secret
Space --> File
Space --> Document
Space --> Workbook
Workbook --> Sheet
%% Job relationships
Agent --> Job
%% Override default styles to remove colors
classDef default fill:none,stroke:#333,stroke-width:2px
style ENV fill:none,stroke:#333
style SPACE fill:none,stroke:#333
style AGENT fill:none,stroke:#333
```
Learn more about Flatfile by understanding our core elements.
Manage and coordinate data import workflows across environments.
Secure, isolated contexts for data import workflows.
Micro-applications for content and data storage.
Containers for data Sheets with defined schemas.
Event-driven functions for automating workflows.
Discrete tasks executed by Agents in response to events.
# Apps
Source: https://flatfile.com/docs/learning-center/architecture/apps
the anatomy of an App
## Apps
Apps are the highest-level organizational unit in Flatfile, designed to manage and coordinate data import workflows across different environments. They serve as containers for organizing related Spaces and provide a consistent configuration that can be deployed across your development pipeline.
* Deploy configurations from development to staging to production
* Maintain consistent behavior across environments
* Test and validate changes before production deployment
* Create reusable templates for common import scenarios.
* Define standard configurations that can be instantiated multiple times
* Share best practices across implementations
* Group related Spaces under a common namespace
* Manage access control at the namespace level
* Organize workspaces by team, department, or function
* Define standard field mappings and validations
* Configure data transformation rules
* Set up automated workflows using Agents
* Manage custom constraints and validations
# Environments
Source: https://flatfile.com/docs/learning-center/architecture/environments
use Environments for testing and authentication.
Environments are isolated entities, intended as a safe place to create and
test different configurations. By default, a development and a production
environment are set up.
| isProd | Name | Description |
| ------- | ------------- | -------------------------------------------------------------------------------------------- |
| *false* | `development` | Use this default environment, and its associated test API keys, as you build with Flatfile. |
| *true* | `production` | When you're ready to launch, create a new environment and swap out your keys. |
The development environment does not count towards your paid credits.
## Creating an Environment
Your `publishableKey` and `secretKey` are specific to an environment therefore
to create a new Environment, you'll have to use a personal access token.
1. Open Settings
2. Click Personal Tokens
3. Use the key pair there to create an access token:
```bash
curl -X POST https://platform.flatfile.com/v1/auth -H 'Content-Type: application/json' -d '{"clientId":"1234-1234", "secret":"1234-1234"}'
```
4. The response will include an `accessToken`. You can present that as your
**Bearer `token`** in place of the `secretKey`.
[Or click here to create an environment in the dashboard](https://platform.flatfile.com/dashboard)
## Guest Authentication
Environments support two types of guest authentication:
1. `magic_link`: This method dispatches an email to your guests, which includes
a magic link to facilitate login.
2. `shared_link`: This method transforms the Space URL into a public one,
typically used in conjunction with embedded Flatfile.
### Additional Info
Should the `guestAuthentication` be left unspecified, both `magic_link` and
`shared_link` types are enabled by default.
It's important to note that `guestAuthentication` settings can be applied at
both Environment and Space levels. However, in case of any conflicting settings,
the authentication type set at the Space level will take precedence over the
Environment level setting. This flexibility enables customization based on
specific needs, ensuring the right balance of accessibility and security.
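The precedence rule above can be pictured as a small resolver (illustrative only, not SDK code): the Space's setting wins when present, otherwise the Environment's, otherwise both types are enabled.

```typescript
type GuestAuth = "magic_link" | "shared_link";

// Resolve the effective guest authentication types for a Space.
function effectiveGuestAuth(
  environmentSetting?: GuestAuth[],
  spaceSetting?: GuestAuth[]
): GuestAuth[] {
  // Space-level settings take precedence over Environment-level settings;
  // when neither is specified, both types are enabled by default.
  return spaceSetting ?? environmentSetting ?? ["magic_link", "shared_link"];
}

console.log(effectiveGuestAuth(undefined, undefined)); // ["magic_link", "shared_link"]
console.log(effectiveGuestAuth(["magic_link"], ["shared_link"])); // ["shared_link"]
```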
## Secret and publishable keys
All Accounts have two key types for each environment. Learn when to use each
type of key:
| Type | Id | Description |
| --------------- | ---------------------- | ----------------------------------------------------------------------------------------------------------------------- |
| Secret key | `sk_23ghsyuyshs7dcrty` | **On the server-side:** Store this securely in your server-side code. Don’t expose this key in an application. |
| Publishable key | `pk_23ghsyuyshs7dcert` | **On the client-side:** Can be publicly-accessible in your application's client-side code. Use when embedding Flatfile. |
The publishable key only has permissions to create a Space.
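Since the two key types look similar, an illustrative guard (not part of the Flatfile SDK) can catch a secret key that accidentally reaches client-side code, relying on the `sk_`/`pk_` prefixes shown above:

```typescript
// Illustrative helper: secret keys start with "sk_", publishable keys with "pk_".
function assertPublishableKey(key: string): string {
  if (!key.startsWith("pk_")) {
    throw new Error("Only a publishable (pk_) key may be used in the browser");
  }
  return key;
}

console.log(assertPublishableKey("pk_23ghsyuyshs7dcert")); // ok, returns the key
// assertPublishableKey("sk_23ghsyuyshs7dcrty") would throw
```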
# Spaces
Source: https://flatfile.com/docs/learning-center/architecture/spaces
the anatomy of a Space
Flatfile Spaces are micro-applications, each having their own database,
filestore, and auth. Use Spaces to integrate Flatfile into your data exchange
workflow, whether that happens directly in your application or as part of a
currently offline process.
## Anatomy
A Space is composed of Workbooks, Files, Users, Documents, Themes and Metadata.
### Reference
[See API Reference](https://reference.flatfile.com/api-reference/spaces/create)
#### `Workbooks` *object*
A Space contains any number of Workbooks. Each
[Workbook](/learning-center/architecture/workbooks)
is a Blueprint-defined database.
Add the label 'pinned' to your workbook to ensure it appears as the first
workbook in the sidebar.
#### `Files` *object*
Files are uploaded directly to a Space.
#### `Users` *object*
By default, all global Admins in Flatfile have access to all Spaces.
Additionally, you can configure a Space to allow for either temporary
(`shared_link`) or named (`magic_link`) Guest Users.
#### `Documents` *object*
Documents are custom pages that are mounted on the Space and display in the
sidebar of a Space in the order of their addition to the Space.
#### `Theme` *object*
Theming of Flatfile is controlled at the Space level. A comprehensive list of
Theme options are available [here](/learning-center/guides/theming).
#### `Metadata` *object*
Spaces can be created and updated with Metadata. This could include a
company-specific identifier such as a UUID for use in API operations in the
Space.
# Workbooks
Source: https://flatfile.com/docs/learning-center/architecture/workbooks
the anatomy of a Workbook
Workbooks can function as a module within other Apps, and soon, they will also
be available for standalone use. We're close to launching Standalone
Workbooks, offering more flexibility and options.
[Learn more](https://flatfile.com/platform/workbooks/) about the
upcoming features and capabilities.
## Overview
Workbooks are analogous to a database, and like a database, they are configured
with a type-strict schema. A Workbook replaces the spreadsheet template you may
share with your users today when requesting data during the data collection
phase of customer onboarding or other file-based data exchange processes. Unlike
a spreadsheet, it's designed to allow your team and users to validate, correct,
and import data with real-time feedback.
With Workbooks, you can:
1. Accept data from many file types beyond CSVs. (And you can write your own
file extractors if you can't find a plugin.)
2. Automatically apply any validation rules a developer has previously defined.
3. Provide end users the ability to add, remove, review, filter, and correct any
data imported into a Workbook.
4. Define, in code, at least one primary Action that submits the reviewed data
to a destination API, database, or workflow step of your choosing.
***
## Anatomy
A Workbook is composed of one or more
[Sheets](/learning-center/blueprint/field-types)
plus any actions you want to take on those Sheets.
### Reference
[See API Reference](https://reference.flatfile.com/api-reference/workbooks/create-a-workbook)
#### `Sheets` *array*
A schema contains Sheets. Like tables in a database or sheets in a spreadsheet,
Sheets isolate different data schemas.
[Learn more about Blueprint](/learning-center/blueprint/about)
#### `Fields` *array*
Sheets contain
[Fields](/learning-center/blueprint/field-types)
. Fields are defined properties of your schema (e.g., a `first_name` field on a
contact sheet).
[Learn more about Blueprint](/learning-center/blueprint/about)
#### `Actions` *array*
Workbooks and Sheets can also contain Actions. Actions are developer-defined
operations or macros invoked by end users on selected data, such as "Submit to
API" or "Download as PDF".
[Learn more about Actions](/learning-center/concepts/actions)
#### `Settings` *object*
`trackChanges: true`: Disables actions on Sheet and Workbook when there is a
commit that has not been completed (often when Hooks are running)
`noMappingRedirect: true`: When the user drags a file into a Sheet in this
Workbook, usually the user is redirected to the Mapping step. This setting
prevents that redirect. This is especially useful when you are using the
[automap plugin](/learning-center/guides/automap).
[Learn more about Blueprint](/learning-center/blueprint/about)
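Both flags live under the Workbook's `settings` object when the Workbook is created; a minimal sketch (the workbook name is illustrative):

```typescript
// Illustrative workbook configuration enabling both settings described above.
const workbookConfig = {
  name: "Orders workbook",
  settings: {
    trackChanges: true,      // block actions while a commit is still processing
    noMappingRedirect: true, // keep users on the Sheet when a file is dropped in
  },
  sheets: [], // define Sheets as usual
};

console.log(workbookConfig.settings.trackChanges); // true
```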
#### `Labels` *array*
To have a workbook appear first in the sidebar, give it the label `pinned` (it
will even have a pin icon!). You can also add other custom labels to your
workbooks as needed.
# Blueprint
Source: https://flatfile.com/docs/learning-center/blueprint/about
define your data schema
Blueprint is your guide to structuring data in Flatfile. It defines how your
data should look, behave, and connect—from simple field validations to complex
relationships between datasets. Think of it as a smart schema that helps you
collect exactly the data you need, in exactly the format you want it.
* Configure the basic behavior of your Sheet with Sheet-level options.
* Learn about the different field types and how to use them.
* Learn about the available constraints and how they work.
* Establish connections between Sheets and fields.
* Configure field, Sheet, and Workbook level user capabilities.
* Give a field a unique presentation in the UI, and learn about options like `allowAdditionalFields`.
# Access
Source: https://flatfile.com/docs/learning-center/blueprint/access
control user interactions with your data
# Sheet level access
With `access` you can control Sheet-level access for users.
### `"*"` *(default)*
A user can use all access actions to this Sheet
Warning: If you use `"*"` access control, users will gain new functionalities
as we expand access controls. Use an exhaustive list today to block future
functionality from being added automatically.
### `"add"`
A user can add a record(s) to the Sheet
### `"delete"`
A user can delete record(s) from the Sheet
### `"edit"`
A user can edit records (field values) in the Sheet
### `"import"`
A user can import CSVs to this Sheet
### `[]` *(empty array)*
If no parameters are specified in the access array, sheet-level readOnly access
will be applied. No data can be added, edited, imported, or removed.
```json
{
"sheets": [
{
"name": "Contacts",
"slug": "contacts",
"access": ["add", "edit"]
// Define fields
}
]
}
```
# Field-level access
### `readonly`
On a field level you can restrict a user's interaction with the data to
`readonly`. This feature is useful if you're inviting others to view uploaded
data, but do not want to allow them to edit that field. See the example in
context below:
```json
"fields": [
{
"key": "salary",
"type": "number",
"config": {
"decimal_places": 2
},
"readonly": true
}
]
```
# Constraints
Source: https://flatfile.com/docs/learning-center/blueprint/constraints
system level validation rules
## Field-Level
Field-level constraints allow you to indicate additional validations that will
be applied to fields of a Sheet. These constraints are in addition to the
implicit constraints based on types of data. (eg: String, Number)
For example, a value is required for every record (RequiredConstraint), or every
record must contain a unique value for the field (UniqueConstraint).
### Required
Required Constraints indicate that the given field **must** be provided with a
non-null value.
A `null` value in this case constitutes an empty cell.
```json
// within the context of a create Workbook API call
"fields": [
{
"key": "firstName",
"type": "string",
"label": "First Name",
"constraints": [
{
"type": "required"
}
]
}
]
```
### Unique
Unique Constraints indicate that the given field value must only appear **once**
in all the values for that field.
`null` values may appear many times; they are not considered unique because
there is no value to compare.
```json
// within the context of a create Workbook API call
"fields": [
{
"key": "internalCustomerId",
"type": "number",
"label": "Internal Customer Id",
"constraints": [
{
"type": "unique"
}
]
}
]
```
### Computed
The computed constraint hides the given field from the mapping process, so your
users will not be able to map incoming columns to this field.
```json
// within the context of a create Workbook API call
"fields": [
{
key: 'age',
type: 'number',
label: 'Age',
description: "The number of years since the person's birth date",
constraints: [
{
type: 'computed',
},
],
},
]
```
## Sheet-Level
### Composite Uniqueness
Composite Uniqueness is a feature where a combination of two or more fields must
be unique across an entire Sheet.
To implement composite uniqueness, add the following parameters to the
`constraints` property:
* `name` *defines the name of the constraint*
* `fields` *array of field keys to consider*
* `type` *defines the type of the constraint*
* `strategy` *defines how to determine uniqueness*. Can be either `concat` *(concatenates values from each field defined inside the `fields` array)* or `hash` *(creates a SHA-1 hash of the values of each field)*
```json
"sheets": [
{
"name": "Products",
"description": "A list of products available for sale",
"slug": "products",
"constraints": [
{
"name": "constraint name",
"fields": ["field_key_one", "field_key_two"],
"type": "unique",
"strategy": "concat"
}
],
// Define fields
}
]
```
# Field Options
Source: https://flatfile.com/docs/learning-center/blueprint/field-options
configurable properties for a Field
The system name of this field. Primarily informs JSON and egress structures.
One of `string`, `number`, `boolean`, `date`, `enum`, `reference`. Defines the
handling of this property. See [Field Types](./field-types) for more
information.
A user-facing descriptive label designed to be displayed in the UI such as a
table header.
A long form description of the property intended to be displayed to an end
user.
An array of system level Validation Rules meant to be applied after hooks are
run.
Configuration relevant to the type of column. See property documentation
below.
Will allow multiple values and store / provide the values in an array if set.
Not all field types support arrays.
Arbitrary object of values to pass through to hooks and egress
# Field Types
Source: https://flatfile.com/docs/learning-center/blueprint/field-types
supported field types
## `string` *(default)*
A property that should be stored and read as a basic string.
```json
{
"key": "currency",
"label": "Product Code",
"type": "string"
},
```
## `number`
A property that should be stored and read as either an integer or floating point
number. Database engines should look at the configuration to determine ideal
storage format.
The number of decimal places to preserve accuracy to. Values with more decimal
places should be automatically rounded, with a warning. A hook can pre-format
values to apply floor or ceiling behavior instead.
```json
{
"key": "price",
"label": "Retail Price",
"type": "number",
"config": {
"decimal_places": 2
}
},
```
## `reference`
Defines a reference to another sheet. Links should be established automatically
by the matching engine or similar upon an evaluation of unique or similar
columns between datasets.
The full path reference to another sheet/table configuration. Must be in the
same workbook.
The type of relationship this defines. Currently, only `has-one` is supported.
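A reference field configuration might look like this sketch (the referenced sheet slug and field keys are illustrative):

```typescript
// Illustrative reference field linking to a "departments" sheet in the
// same Workbook.
const referenceField = {
  key: "department",
  type: "reference",
  config: {
    ref: "departments", // slug of the referenced sheet
    key: "name", // field on the referenced sheet to match against
    relationship: "has-one",
  },
};
```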
## `enum`
Defines an enumerated list of options for the user to select from. Matching
tooling attempts to resolve incoming data assignment to a valid option. The
maximum number of options for this list is `100`. For larger lists, users should
use the `reference` or future `lookup` types.
Permit the user to create new options for this specific field.
The field to sort the options by (label, value, ordinal)
An array of valid options the user can select from
The value or ID of this option. This value will be sent in egress
A visual label for this option, defaults to value if not provided
An arbitrary JSON object to be associated with this option and made available to hooks
The ordinal position of this option in the list. Only used if `sortBy` is set to `ordinal`. Options are sorted in ascending order.
```json
{
"key": "status",
"label": "Status",
"type": "enum",
"config": {
"options": [
{
"value": "active"
},
{
"value": "inactive",
"label": "Disabled",
"meta": {
"foo": "bar"
}
}
]
}
},
```
## `boolean`
A `true` or `false` value type. Matching engines should attempt to resolve all
common ways of representing this value and it should usually be displayed as a
checkbox.
```json
{
"key": "is_active",
"label": "Active",
"type": "boolean"
}
```
## `date`
Store a field as a GMT date. Data hooks must convert this value into a
`YYYY-MM-DD` format in order for it to be considered a valid value. Datetime is
not currently available, but will be a separately supported Field Type in the
future, and will include timezone.
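Since hooks must supply `YYYY-MM-DD` values, a normalization step along these lines can run before validation. This helper is a sketch, not part of the platform:

```typescript
// Sketch: coerce a user-supplied date string to the YYYY-MM-DD format the
// date field type requires; returns null when the input cannot be parsed.
function toIsoDate(value: string): string | null {
  const parsed = new Date(value);
  if (Number.isNaN(parsed.getTime())) return null;
  return parsed.toISOString().slice(0, 10); // keep the date portion only
}
```

A Record Hook could call `toIsoDate` on the raw value, write the result back to the field, and add an error to the record when `null` is returned.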
# Relationships
Source: https://flatfile.com/docs/learning-center/blueprint/relationships
referencing data in other Sheets
The relationship defines a reference to another Sheet, similar to a Foreign Key.
Links should be established automatically by the mapping engine (the logic that
creates a mapping plan automatically when a user maps their data into a
workbook) or similar upon an evaluation of unique or similar columns between
datasets.
## Options
* **ref** - Full path reference to another Sheet/table configuration. Must be in
the same Workbook
* **relationship** - The type of relationship this defines. Currently, only
`has-one` is supported
## Inheritance
This example introduces two fields linked by a relationship, inheriting the
respective email addresses of the father and mother from another Sheet.
```json workbooks.create
// in the context of create Workbook API call
{
"fields": [
{
"key": "father",
"type": "reference",
"config": {
"ref": "parents",
"key": "email",
"relationship": "has-one"
}
},
{
"key": "mother",
"type": "reference",
"config": {
"ref": "parents",
"key": "email",
"relationship": "has-one"
}
}
]
}
```
## Lookups
This example mimics a vlookup operation by fetching a value from a linked record
and applying it to a designated field in the original record.
The linked record is designated by a reference field in the original record, and
the sought-after value is defined by a lookup field in the linked record.
When a lookup value is located, it's placed into the original record's target
field. Furthermore, a message is added to the record, detailing the origin of
the retrieved value.
The record to perform the vlookup on.
The name of the reference field on the original record that links to the
linked record.
The name of the field on the linked record that contains the value to be
looked up.
The name of the field on the original record that the lookup value should be
set to.
```jsx
// in the context of listener.use(recordHook){}
const isNotNil = (value) => value !== null && value !== undefined;
export const vlookup = (
record,
referenceFieldKey,
lookupFieldKey,
targetFieldKey,
) => {
console.log("Initial Record: " + JSON.stringify(record));
const links = record.getLinks(referenceFieldKey);
console.log("Linked Record: " + JSON.stringify(links));
const lookupValue = links?.[0]?.[lookupFieldKey];
console.log(
"Reference Fields Key: " +
referenceFieldKey +
" : " +
"LookUpValue: " +
lookupValue,
);
if (isNotNil(lookupValue)) {
record.set(targetFieldKey, lookupValue);
record.addInfo(
targetFieldKey,
`${targetFieldKey} set based on ${referenceFieldKey}.`,
);
}
};
```
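The helper above can be exercised outside a listener with a minimal mock of the record surface it uses (`getLinks`, `set`, `addInfo`); the field keys and linked data here are illustrative:

```typescript
// Minimal mock record exercising the same lookup flow as vlookup above.
const mockRecord = {
  values: { father: "dad@example.com", fatherEmail: "" } as Record<string, unknown>,
  getLinks(_referenceFieldKey: string) {
    // Pretend the mapping engine resolved one linked parent record.
    return [{ email: "dad@example.com" }];
  },
  set(key: string, value: unknown) {
    this.values[key] = value;
  },
  addInfo(_key: string, _message: string) {
    // In a real listener this message surfaces in the UI.
  },
};

// The same steps vlookup performs: read the link, copy the looked-up value.
const lookupValue = mockRecord.getLinks("father")?.[0]?.email;
if (lookupValue !== null && lookupValue !== undefined) {
  mockRecord.set("fatherEmail", lookupValue);
  mockRecord.addInfo("fatherEmail", "fatherEmail set based on father.");
}
```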
# Sheet Options
Source: https://flatfile.com/docs/learning-center/blueprint/sheet-options
configurable properties for a Sheet
The name of your Sheet as it will appear to your end users.
A sentence or two describing the purpose of your Sheet.
A unique identifier for your Sheet. Used to reference your Sheet in code, for
example in a [Record Hook](/learning-center/guides/handling-data#example).
A boolean specifying whether or not this sheet is read only. Read only sheets
are not editable by end users.
When this is set to `true`, your Sheet will be able to accept additional
fields beyond what you specify in its configuration. These additional fields
can be added via API, or by end users during the file import process. These
fields will have a `treatment` of "user-defined".
An array specifying the access controls for this Sheet. [Read more about
access controls](/learning-center/blueprint/access).
This is where you define your Sheet's data schema. The collection of fields in
your Sheet determines the shape of data you wish to accept.
{""}
Learn more about [Field types](/learning-center/blueprint/field-types), [Constraints](/learning-center/blueprint/constraints),
and [Relationships](/learning-center/blueprint/relationships).
An array of actions that end users can perform on this Sheet. [Read more about
actions](/learning-center/concepts/actions).
Use `metadata` to store any extra contextual information about your Sheet.
Must be valid JSON. This metadata can be updated at any time through the workbook update endpoint by including the metadata field in the sheet configuration.
Use `mappingConfidenceThreshold` to configure the minimum required confidence
for mapping jobs targeting that sheet. This can be used to tune the behavior
of fuzzy matches, or disable fuzzy matches entirely. **Must be greater than 0
and less than or equal to 1 (exact match)**
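For example, a Sheet that accepts fuzzy matches only at 60% confidence or better might be configured like this sketch (the Sheet name, slug, and field are illustrative):

```typescript
// Illustrative Sheet configuration: fuzzy mapping suggestions below the
// threshold are not applied; a value of 1 permits exact matches only.
const contactsSheet = {
  name: "Contacts",
  slug: "contacts",
  mappingConfidenceThreshold: 0.6,
  fields: [{ key: "firstName", type: "string", label: "First Name" }],
};
```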
Use the `constraints` option to define constraints for a specific Sheet.
Learn more about [Constraints](/learning-center/blueprint/constraints).
# Actions
Source: https://flatfile.com/docs/learning-center/concepts/actions
The anatomy of Actions
An Action is a code-based operation that runs where that Action is mounted. Actions run when a user clicks the corresponding user prompt in Flatfile. Additional Actions can be mounted on a Sheet, a Workbook, or a file by a developer.
## Types of Actions
### Built-in Actions
Workbooks, Sheets and files come with five default built-in actions:
1. Mapping data from one Workbook to another
2. Deleting data from a Workbook (or the entire file)
3. Exporting (downloading) data from a Workbook (or a file)
4. Find and Replacing data in a Sheet
5. Applying a Mutate function to data in a Sheet
When these built-in actions are clicked, they create Jobs. You can listen for these events and take appropriate actions if required. Learn more about [Jobs](/learning-center/concepts/jobs).
### Developer-Created Actions
Additional Actions can be mounted on a Sheet, a Workbook, or a file. When an action is clicked, it will create a job `operation` on the `domain` it is mounted.
For example, an Action with the property `operation: 'my-action'` placed on a Workbook would spawn a job called: `workbook:my-action`. This is the job you'd listen for to respond accordingly.
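The naming convention can be sketched directly (the Action and mount point are illustrative):

```typescript
// A developer-created Action mounted on a Workbook (illustrative).
const myAction = { operation: "my-action", label: "My Action" };
const mountedDomain = "workbook";

// Clicking the Action spawns a Job named domain:operation. A listener would
// filter on this value, e.g. listener.on("job:ready", { job: jobName }, ...).
const jobName = `${mountedDomain}:${myAction.operation}`;
```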
## The Anatomy of an Action
### Required Parameters
All Actions contain the information needed to let a user run the Action. This includes:
A unique identifier for the Action that is used by the listener to determine
what work to do as part of the resulting Job.
The text that will be displayed to the user in the UI as a button or menu
item.
### Optional Parameters
Optionally, an Action can also contain extra details about how it appears in the
UI. These include:
Whether the Action is a primary Action for the resource. Depending on the
resource, primary Actions are displayed in the UI in a more prominent way.
When set to true, a modal is shown to confirm the Action.
The text that will be displayed to the user when a confirmation modal is used.
Phrase this as a question.
The icon to be displayed. Default is a lightning bolt. If you want to omit the
icon, use `'none'`. All available icons are listed here: [Flatfile
icons](/reference/icons). If the `label` is empty, then an icon will always
be displayed.
Setting this will display text in the UI for both buttons and list items as a
tooltip on hover.
Setting this will display text on disabled Actions (based on the state of the
Action) in the UI for both buttons and list items as a tooltip on hover.
`[{ type: 'error' }]`
`[{ type: 'info' }]`
`[{ type: 'hasAllValid' }]`: Adding this constraint will disable a Sheet or Workbook Action when there are invalid
records.
`[{ type: 'hasSelection' }]`: Adding this constraint will disable a Sheet or Workbook Action when no records are selected.
`[{ type: 'hasData' }]`: Adding this constraint will disable a Sheet or Workbook Action when there are no records.
Can be `foreground`, `background` or `toolbarBlocking`. Foreground mode will
prevent interacting with the entire resource until complete. toolbarBlocking
disables the sheet-level toolbar and Column Header menus, while still allowing
users to enter records manually.
### Usage
An Action with all of the above properties would look like this:
```js
{
operation: 'my-action',
label: 'My Action',
primary: true,
confirm: true,
description: 'Are you sure you want to run this action?',
constraints: [{ type: 'hasAllValid' }, { type: 'hasSelection' }],
mode: 'foreground',
tooltip: 'Click to run action'
}
```
## inputForm
When initiating an action, there may be instances where additional information is required from the end user to successfully complete the intended task. For example, you might want to enable users to specify the name of the file they intend to export.
In such cases, if you configure input fields for your action, a secondary dialog will be presented to the end user, prompting them to provide the necessary information. Once the user has entered the required details, they can proceed with the action seamlessly.
An object representing the input form configuration for the action.
The type of the input form. Accepts: `simple`
### fields
An array of field objects representing the input form fields.
The key for the field.
The label for the field.
The type of the field. Accepts: `string` | `textarea` | `number` | `boolean` | `enum`
The default value for the field.
A description of the field.
#### config
An object containing configuration options for the field.
An array of options for the field, each represented as an object.
The value or ID of the option.
A visual label for the option.
A short description of the option.
An arbitrary JSON object associated with the option.
#### constraints
An array of constraints for the field.
The type of constraint. Accepts: `required`
### Usage
An Action that requests input from the end user using `inputForm` would look like this. Your listener would then grab this data from the Job and take action on it. Learn more about [Using Actions with Inputs](/learning-center/guides/actions#actions-with-input-forms).
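For example, an export Action that prompts for a file name might look like this sketch (the operation, keys, and labels are illustrative):

```typescript
// Illustrative Action with an inputForm: a dialog collects a required
// file name before the Job is created.
const exportAction = {
  operation: "export-data",
  label: "Export Data",
  mode: "foreground",
  inputForm: {
    type: "simple",
    fields: [
      {
        key: "fileName",
        label: "File name",
        type: "string",
        defaultValue: "export",
        description: "The name of the file to download",
        constraints: [{ type: "required" }],
      },
    ],
  },
};
```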
## Learn more
### Guides
Learn more about Workbook, Sheet, and File Actions.
Run an operation on the entire dataset
Run an operation at the Sheet level
Attach additional actions to a File
### Example Projects
Find an Actions example in the Flatfile GitHub repository.
Clone the Actions example in Typescript
Clone the Actions example in Javascript
# Events
Source: https://flatfile.com/docs/learning-center/concepts/events
The anatomy of Events.
The Flatfile platform provides a [REST API](https://reference.flatfile.com/) for manipulating resources, along with an Event bus for subscribing to notifications about those resources.
To simplify use of the Flatfile API, the Flatfile PubSub Client serves as a lightweight wrapper that lets developers trigger API calls upon receiving Events from any PubSub driver, ensuring a smooth and streamlined integration process.
#### The anatomy of an Event
Flatfile Events adhere to a standardized structure, and Event listeners have the flexibility to handle Events within Flatfile using any of the following syntaxes.
[See API Reference](https://reference.flatfile.com/api-reference/events/create-an-event)
### Using Events
Once an Event is received, it is routed to any awaiting listeners which are added with `addEventListener()` or its alias `on()`.
An Event context is passed to an **EventFilter**
```typescript
export type EventFilter = Record<string, string | string[]>;
// example event context
{
context: {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"actorId": "us_ag_1234"
}
}
```
#### Building with events
By leveraging the robust capabilities of our @flatfile/listener library, we can seamlessly intercept any Event and execute a callback function based on the values encapsulated within the Event itself.
```typescript
listener.on(
"job:ready",
{ job: "space:configure" },
async (event: FlatfileEvent) => {
//do something here
}
);
```
## Event Topics
Use the context and payload provided with each Event topic to determine the
desired outcome or plan subsequent actions accordingly.
### Workbook
#### `workbook:created`
Called when a new workbook is created.
```json
{
"domain": "workbook",
"topic": "workbook:created",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"jobId": "us_jb_1234",
"actorId": "us_usr_1234"
}
}
```
#### `workbook:updated`
Called when workbook metadata is updated. For example, updating the name of a
workbook. Adding data to a workbook does not emit this Event.
```json
{
"domain": "workbook",
"topic": "workbook:updated",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"workbookId": "us_wb_1234",
"actorId": "us_ag_1234"
}
}
```
#### `workbook:deleted`
Called when a workbook is deleted.
```json
{
"domain": "workbook",
"topic": "workbook:deleted",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"workbookId": "us_wb_1234",
"actorId": "us_ag_1234"
}
}
```
#### `workbook:expired`
Called when a workbook is expired.
```json
{
"domain": "workbook",
"topic": "workbook:expired",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"workbookId": "us_wb_1234"
}
}
```
### Document
#### `document:created`
Called when a document is created on a workbook.
```json
{
"domain": "document",
"topic": "document:created",
"context": {
"actorId": "us_key_1234",
"spaceId": "us_sp_1234",
"accountId": "us_acc_1234",
"documentId": "us_dc_1234",
"environmentId": "us_env_1234"
}
}
```
#### `document:updated`
Called when a document is updated on a workbook.
```json
{
"domain": "document",
"topic": "document:updated",
"context": {
"actorId": "us_key_1234",
"spaceId": "us_sp_1234",
"accountId": "us_acc_1234",
"documentId": "us_dc_1234",
"environmentId": "us_env_1234"
}
}
```
#### `document:deleted`
Called when a document is deleted on a workbook.
```json
{
"domain": "document",
"topic": "document:deleted",
"context": {
"actorId": "us_key_1234",
"spaceId": "us_sp_1234",
"accountId": "us_acc_1234",
"documentId": "us_dc_1234",
"environmentId": "us_env_1234"
}
}
```
### Sheet
#### `sheet:updated`
Called when a sheet is updated. For example, running sheet level validations.
Adding data to a sheet does not emit this event.
```json
{
"domain": "sheet",
"topic": "sheet:updated",
"context": {
"slugs": { "sheet": "contacts" },
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"workbookId": "us_wb_1234",
"sheetId": "us_sh_1234",
"sheetSlug": "contacts",
"actorId": "us_ag_XRGmuGfL"
}
}
```
#### `sheet:deleted`
Called when a sheet is deleted.
```json
{
"domain": "sheet",
"topic": "sheet:deleted",
"context": {
"slugs": { "sheet": "contacts" },
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"workbookId": "us_wb_1234",
"sheetId": "us_sh_1234",
"sheetSlug": "contacts",
"actorId": "us_ag_XRGmuGfL"
}
}
```
### Snapshot
#### `snapshot:created`
Called when a snapshot is created of the sheet. This can be triggered by the
API endpoint or when AI Assist is used to modify a sheet.
```json
{
"domain": "sheet",
"topic": "snapshot:created",
"context": {
"appId": "us_app_1234",
"accountId": "us_acc_1234",
"snapshotId": "us_ss_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"workbookId": "us_wb_1234",
"sheetId": "us_sh_1234",
"actorId": "us_ag_XRGmuGfL"
}
}
```
### Record
#### `commit:created`
Commits are created when a cell in a Record (which is in a Sheet) is created
or updated.
```json
{
"domain": "workbook",
"topic": "commit:created",
"context": {
"slugs": { "sheet": "MasterWorkbook/Data" },
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"workbookId": "us_wb_1234",
"sheetId": "us_sh_1234",
"sheetSlug": "MasterWorkbook/Data",
"versionId": "us_vr_1234",
"actorId": "us_usr_1234"
}
}
```
#### `commit:completed`
This event fires only when `trackChanges` is enabled on the Workbook, once a
commit has completed and any record changes have finished.
```json
{
"domain": "workbook",
"topic": "commit:completed",
"context": {
"appId": "us_app_1234",
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"workbookId": "us_wb_1234",
"sheetId": "us_sh_1234",
"versionId": "us_vr_1234",
"actorId": "us_usr_1234",
"commitId": "us_vr_1234"
}
}
```
#### `layer:created`
Called when a new layer is created within a commit. A layer is a
programmatically generated artifact of a commit. For example, when a data hook
or formatting rule applies formatting or validation automatically. Layers
cannot be updated or deleted, but you can ignore them to see the original
source data.
### File Upload
#### `file:created`
Called when a file upload begins or a new file is created.
```json
{
"domain": "file",
"topic": "file:created",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"fileId": "us_fl_1234",
"actorId": "us_usr_1234"
}
}
```
#### `file:updated`
Called when a file is updated. For example, when a file has been extracted
into a workbook.
```json
{
"domain": "file",
"topic": "file:updated",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"fileId": "us_fl_1234",
"actorId": "us_acc_1234"
},
"payload": {
"status": "complete",
"workbookId": "us_wb_1234"
}
}
```
#### `file:deleted`
Called when a file is deleted.
```json
{
"domain": "file",
"topic": "file:deleted",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"fileId": "us_fl_1234",
"actorId": "us_usr_1234"
}
}
```
#### `file:expired`
Called when a file is expired.
```json
{
"domain": "file",
"topic": "file:expired",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"fileId": "us_fl_1234"
}
}
```
### Jobs
#### `job:created`
Called when a new job is first created. Some jobs will enter an optional
planning state at this time. A job with 'immediate' set to true will skip the
planning step and transition directly to 'ready.'
```json
{
"domain": "job", //job domain
"topic": "job:created",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"jobId": "us_jb_1234",
"actorId": "us_usr_1234"
},
"payload": {
"job": "space:configure", //this is domain:operation
"info": null,
"domain": "space", //event domain
"status": "created",
"operation": "configure"
}
}
```
#### `job:ready`
Called when a new job is ready to execute. Either the job has a complete plan
of work or the job is configured to not need a plan. This is the only event
most job implementations will care about. Once a ready job is acknowledged by
a listener, it transitions into an executing state.
```json
{
"domain": "job", //job domain
"topic": "job:ready",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"jobId": "us_jb_1234",
"actorId": "us_usr_1234"
},
"payload": {
"job": "space:configure", //this is domain:operation
"info": null,
"domain": "space", //event domain
"status": "ready",
"operation": "configure"
}
}
```
#### `job:updated`
Called when a job is updated. For example, when a listener updates the state
or progress of the job. The event will emit many times as the listener
incrementally completes work and updates the job.
```json
{
"domain": "job", //job domain
"topic": "job:updated",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"jobId": "us_jb_1234",
"actorId": "us_usr_1234"
},
"payload": {
"job": "space:configure", //this is domain:operation
"info": null,
"domain": "space", //event domain
"status": "ready",
"operation": "configure"
}
}
```
#### `job:completed`
Called when a job has completed.
```json
{
"domain": "job", //job domain
"topic": "job:completed",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"jobId": "us_jb_1234",
"actorId": "us_usr_1234"
},
"payload": {
"job": "space:configure", //this is domain:operation
"info": "Configuring space...",
"domain": "space", //event domain
"status": "complete",
"operation": "configure"
}
}
```
#### `job:outcome-acknowledged`
Called to trigger workflow actions after the user has acknowledged that the
job has completed or failed. Background jobs will skip this step.
```json
{
"domain": "job", //job domain
"topic": "job:outcome-acknowledged",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"workbookId": "us_wb_1234",
"sheetId": "us_sh_1234",
"jobId": "us_jb_1234",
"actorId": "us_usr_1234"
},
"payload": {
"job": "sheet:buildSheet", //this is domain:operation
"info": null,
"domain": "sheet", //event domain
"status": "failed",
"operation": "buildSheet"
}
}
```
#### `job:failed`
Called when a job fails.
```json
{
"domain": "job", //job domain
"topic": "job:failed",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"workbookId": "us_wb_1234",
"sheetId": "us_sh_1234",
"jobId": "us_jb_1234",
"actorId": "us_usr_1234"
},
"payload": {
"job": "sheet:buildSheet", //this is domain:operation
"info": null,
"domain": "sheet", //event domain
"status": "failed",
"operation": "buildSheet"
}
}
```
#### `job:deleted`
Called when a job is deleted.
```json
{
"domain": "job",
"topic": "job:deleted",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"jobId": "us_jb_1234",
"actorId": "us_ag_1234"
}
}
```
### Agents
#### `agent:created`
Called when a new agent is deployed.
```json
{
"domain": "space",
"topic": "agent:created",
"context": {
"actorId": "us_ag_1234",
"accountId": "us_acc_1234",
"environmentId": "us_env_1234"
}
}
```
#### `agent:updated`
Called when an agent is updated.
```json
{
"domain": "space",
"topic": "agent:updated",
"context": {
"actorId": "us_ag_1234",
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"agentId": "us_ag_1234"
}
}
```
#### `agent:deleted`
Called when an agent is deleted.
```json
{
"domain": "space",
"topic": "agent:deleted",
"context": {
"actorId": "us_ag_1234",
"accountId": "us_acc_1234",
"environmentId": "us_env_1234"
"agentId": "us_ag_1234"
}
}
```
### Space
#### `space:created`
Called when a new space is created.
```json
{
"domain": "space",
"topic": "space:created",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"actorId": "us_usr_1234"
}
}
```
#### `space:updated`
Called when a space is updated.
```json
{
"domain": "space",
"topic": "space:updated",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"actorId": "us_usr_1234"
}
}
```
#### `space:deleted`
Called when a space is deleted.
```json
{
"domain": "space",
"topic": "space:deleted",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"workbookId": "us_wb_1234",
"actorId": "us_usr_1234"
}
}
```
#### `space:expired`
Called when a space is expired.
```json
{
"domain": "space",
"topic": "space:expired",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"workbookId": "us_wb_1234"
}
}
```
#### `space:archived`
Called when a space is archived.
```json
{
"domain": "space",
"topic": "space:archived",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234"
}
}
```
#### `space:guestAdded`
Called when a guest is added.
```json
{
"domain": "space",
"topic": "space:guestAdded",
"context": {
"actorId": "us_usr_1234",
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234"
}
}
```
#### `space:guestRemoved`
Called when a guest's access is revoked from a space.
```json
{
"domain": "space",
"topic": "space:guestRemoved",
"context": {
"actorId": "us_usr_1234",
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234"
}
}
```
### Secret
#### `secret:created`
Called when a new secret is created.
```json
{
"domain": "secret",
"topic": "secret:created",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"actorId": "us_usr_1234"
}
}
```
#### `secret:updated`
Called when a secret is updated.
```json
{
"domain": "secret",
"topic": "secret:updated",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"actorId": "us_usr_1234"
}
}
```
#### `secret:deleted`
Called when a secret is deleted.
```json
{
"domain": "secret",
"topic": "secret:deleted",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"actorId": "us_usr_1234"
}
}
```
### Data clips
#### `data-clip:created`
Called when a new data clip is created.
```json
{
"domain": "data-clip",
"topic": "data-clip:created",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"actorId": "us_usr_1234"
},
"payload": {
"accountId": "us_acc_1234",
"dataClipId": "us_acc_1234",
"status": "OPEN"
}
}
```
#### `data-clip:updated`
Called when a data clip's details are updated.
This is not called when records are updated in a data clip.
```json
{
"domain": "data-clip",
"topic": "data-clip:updated",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"actorId": "us_usr_1234"
},
"payload": {
"accountId": "us_acc_1234",
"dataClipId": "us_acc_1234",
"status": "OPEN"
}
}
```
#### `data-clip:deleted`
Called when a data clip is deleted.
```json
{
"domain": "data-clip",
"topic": "data-clip:deleted",
"context": {
"accountId": "us_acc_1234",
"environmentId": "us_env_1234",
"spaceId": "us_sp_1234",
"actorId": "us_usr_1234"
},
"payload": {
"accountId": "us_acc_1234",
"dataClipId": "us_acc_1234",
"status": "OPEN"
}
}
```
# Jobs
Source: https://flatfile.com/docs/learning-center/concepts/jobs
The anatomy of Jobs
In Flatfile, a Job represents a large unit of work performed asynchronously on a resource
such as a file, [Workbook](/learning-center/architecture/workbooks),
or [Sheet](/learning-center/blueprint/about#terms).
The Jobs workflow provides visibility into the status and progress of your Jobs,
allowing you to monitor and troubleshoot the data processing pipeline.
## Types of Jobs
Jobs can be triggered in a number of ways, most commonly in response to user activity.
Jobs are then managed via [Listeners](/learning-center/concepts/listeners), which receive [Events](/learning-center/concepts/events) published by Flatfile in response to activity.
There are three types of Jobs on the Flatfile Platform:
* Action-based Jobs, attached to custom Actions
* Custom Jobs, created dynamically in your Listener
* System Jobs, created and managed by Flatfile
### Action Based Jobs
Actions are developer-defined operations that can be mounted on a number of domains (including Sheets, Workbooks, Documents, and Files).
Mounting an Action means attaching a custom operation to that domain.
That operation can then be triggered by a user event (clicking a button or selecting a menu item).
When an Action is triggered, a `job:ready` Event for a Job named
`[domain]:[operation]` is published. Your [Listener](/learning-center/concepts/listeners)
can then be configured to respond to that Action via its Event.
To run an Action based job, two configurations are necessary.
First, create an [Action](/learning-center/concepts/actions) on a domain. Here's an example of a Workbook containing an Action:
```jsx javascript
api.workbook.create({
name: "October Report Workbook",
actions: [
{
label: "Export Data",
description: "Send data to destination system",
operation: "export",
type: "file",
},
],
});
```
Then, create a [Listener](/learning-center/concepts/listeners) to respond to the Action:
```jsx javascript
listener.on(
"job:ready",
{ job: "workbook:export" },
async ({ context: { jobId } }) => {
try {
await api.jobs.ack(jobId, {
info: "Starting submit job...",
progress: 10,
});
// Custom code here
await api.jobs.complete(jobId, {
outcome: {
message: "Submit Job was completed successfully.",
},
});
} catch (error) {
await api.jobs.fail(jobId, {
outcome: {
message: "This Job failed.",
},
});
}
}
);
```
Note that the Listener is listening for the `job:ready` event, for the `workbook:export` Job, which was defined in our Workbook.
### Custom Jobs
Another trigger option is to create a Custom Job via SDK/API. In the SDK, Jobs are created by calling the `api.jobs.create()` method.
Creating a custom Job in your Listener enables any Event to trigger a Job.
Here's an example of creating a custom Job in a Listener:
```jsx javascript
listener.on(
"commit:created",
{ sheet: "contacts" },
async ({ context: { workbookId, sheetId } }) => {
const { data } = await api.jobs.create({
type: "workbook",
operation: "myCustomOperation",
trigger: "immediate",
source: workbookId
});
}
);
```
Note that the `trigger` for this Job is set to `immediate`, which means that the Job will be created and executed immediately upon the Event firing.
Therefore, we should have our Listener ready to respond to this Job:
```jsx javascript
listener.on(
"job:ready",
{ job: "workbook:myCustomOperation" },
async ({ context: { jobId, workbookId }, payload }) => {
try {
await api.jobs.ack(jobId, {
info: "Starting my custom operation.",
progress: 10,
});
// Custom code here.
await api.jobs.complete(jobId, {
outcome: {
message: "Successfully completed my custom operation.",
},
});
} catch {
await api.jobs.fail(jobId, {
outcome: {
message: "Custom operation failed.",
},
});
}
}
);
```
Please note that Flatfile does not support Job plans for custom Actions.
### System Jobs
Internally, Flatfile uses Jobs to power many of the features of the Flatfile Platform, such as extraction, record mutation, and AI Assist.
Here are some examples of Jobs that the Flatfile Platform creates and manages on your behalf:
| Job Name | Description |
| --------------- | ------------------------------------------------------------- |
| `Extract` | Extracts data from the specified source. |
| `Map` | Maps data from its ingress format to Blueprint fields. |
| `DeleteRecords` | Deletes records from a dataset based on specified criteria. |
| `Export` | Exports data to a specified format or destination. |
| `MutateRecords` | Alters records in a dataset according to defined rules. |
| `Configure` | Sets up or modifies the configuration of a Space. |
| `AiAssist` | Utilizes AI to assist with tasks such as data categorization. |
| `FindReplace` | Searches for specific values and replaces them. |
The `space:configure` Job is the only system-level Job published for developer
consumption. This is a special Job that allows developers to configure their
Spaces dynamically. See our [Space
Configuration](/learning-center/guides/dynamic-configurations) documentation for more
information.
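As a sketch of that flow, the handler below acknowledges the Job, configures the Space, and completes the Job. The `api` object here is a minimal in-memory stand-in so the example is self-contained; in a real listener it comes from `@flatfile/api`, and the workbook details are illustrative.

```javascript
// Minimal stand-in for the Flatfile API client, so the Job lifecycle is
// visible end to end. In a real listener, import the client from "@flatfile/api".
const calls = [];
const api = {
  jobs: {
    ack: async (id, opts) => calls.push("ack"),
    complete: async (id, opts) => calls.push("complete"),
    fail: async (id, opts) => calls.push("fail"),
  },
  workbooks: { create: async (config) => calls.push("workbook:create") },
};

// The shape of a typical space:configure handler: ack the Job, build the
// Space's resources, then complete (or fail) the Job.
async function onSpaceConfigure({ jobId, spaceId }) {
  await api.jobs.ack(jobId, { info: "Configuring space...", progress: 10 });
  try {
    await api.workbooks.create({ spaceId, name: "My Workbook", sheets: [] });
    await api.jobs.complete(jobId, {
      outcome: { message: "Space configured." },
    });
  } catch (error) {
    await api.jobs.fail(jobId, {
      outcome: { message: "Configuration failed." },
    });
  }
}
```

In a deployed listener, this function body would sit inside `listener.on("job:ready", { job: "space:configure" }, ...)`.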
## The Anatomy of a Job
### Lifecycle Events
Jobs fire the following Events during their lifecycle. In chronological order, the Job Events are:
| Event | Description |
| ------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `job:created` | Fires when a Job is created, but before it does anything. |
| `job:ready` | Fires when a Job is ready to move into the execution stage, but before it does anything. |
| `job:updated` | Fires when there is an update to a Job while it is executing. |
| `job:completed` OR `job:failed` | `job:completed` fires when a Job is completed successfully, `job:failed` fires if a Job is completed but fails. One of these events will fire upon Job completion, but never both. |
| `job:outcome-acknowledged` | Fires when a user acknowledges the completion of a Job through a UI popup. |
You can listen on any of these events in your [Listener](/learning-center/concepts/listeners), but the most common event to listen for is `job:ready`.
For more on managing your Job see [Working with Jobs](#working-with-jobs).
### Required Parameters
| Parameter | Description |
| ----------- | ---------------------------------------------------------- |
| `type` | Workbook, File, Sheet, or Space |
| `operation` | `export`, `extract`, `map`, `delete`, etc. |
| `source` | The id of the data source (FileId, WorkbookId, or SheetId) |
### Optional Parameters
| Parameter | Description |
| ----------------------- | ------------------------------------------------------------------------------------------- |
| `trigger` | `manual` or `immediate` |
| `destination` | The id of the data target (if any) |
| `status` | `created`, `planning`, `scheduled`, `ready`, `executing`, `complete`, `failed`, `cancelled` |
| `progress` | A numerical or percentage value indicating the completion status of the Job |
| `estimatedCompletionAt` | An estimated completion time. The UI will display the estimated processing time in the foreground Job overlay. |
| Field | Description |
| ------------- | --------------------------------------------------------------------------------- |
| `label` | A user-friendly name for the action. |
| `description` | A string describing the action. |
| `tooltip` | Additional info on hover. |
| `schedule` | `weekly`, `daily`, `hourly` |
| `operation` | The operation to perform. |
| `mode` | `foreground`, `background`, `toolbarBlocking` |
| `primary` | Whether this action is considered the primary or default action. |
| `confirm` | Whether user confirmation is required before the action is executed. |
| `icon` | Icon representative of the action. |
| `inputForm` | `{ type: simple, fields: { key, label, description, type, *config, constraints } }` |
| `*config` | `{ options: { value, label, description, color, icon, meta } }` |
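To make the `inputForm` shape concrete, here is an illustrative Action definition following the structure in the table above; the operation name, field keys, and option values are examples only, not a canonical schema.

```javascript
// A hypothetical Action with an input form. When a user triggers the Action,
// the form collects values that are passed to the resulting Job as `input`.
const exportAction = {
  operation: "export",
  label: "Export Data",
  primary: true,
  inputForm: {
    type: "simple",
    fields: [
      {
        key: "format",
        label: "File format",
        description: "Choose the output format",
        type: "enum",
        config: {
          options: [
            { value: "csv", label: "CSV" },
            { value: "xlsx", label: "Excel" },
          ],
        },
        constraints: [{ type: "required" }],
      },
    ],
  },
};
```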
`input` — Input parameters for the Job type:
`subject` — The subject resource of the Job:
| Field | Description |
| -------- | ---------------------------------------------------------- |
| resource | Identifier for the subject resource. |
| type | resource, collection |
| query | The query used to locate or identify the subject resource. |
| params | Parameters that detail or modify the subject query. |
`outcome` — The outcome of the Job, shown to the user on completion:
| Field | Description |
| ----------- | ------------------------------------------------------- |
| heading | The heading text summarizing the outcome of the action. |
| acknowledge | Indicates if the outcome requires user acknowledgment. |
| message | Detailed message describing the outcome. |
| buttonText | Label for the button to acknowledge the outcome. |
| next | `{ id, url, download, wait, snapshot, retry }` |
`info` — Additional information regarding the Job's current status.
`managed` — Indicates whether the Job is managed by the Flatfile platform or not.
`mode` — `foreground`, `background`, or `toolbarBlocking`.
`metadata` — Additional metadata for the Job. You can store any additional information here, such as the IDs of Documents or Sheets created during the execution of the Job.
Please see our [API Reference](https://reference.flatfile.com/api-reference/jobs/) for details on all possible values.
## Working with Jobs
Jobs can be managed via SDK/API. For a complete list of ways to interact with Jobs, please see our [API Reference](https://reference.flatfile.com). Commonly, Jobs are acknowledged, progressed, and then completed or failed. Here's a look at those steps.
You can also use the [JobHandler Plugin](https://flatfile.com/plugins/plugin-job-handler/) which
simplifies the handling of Flatfile Jobs. This plugin works by listening to
the `job:ready` event and executing the handler callback. There is an optional
tick function which updates the Job's progress.
### Jobs.Ack
First, acknowledge a Job. This will update the Job's status to `executing`.
```jsx javascript
await api.jobs.ack(jobId, {
info: "Starting submit job...",
progress: 10,
});
```
### Jobs.Update
Once a Job is acknowledged, you can begin running your custom operation. Jobs were designed to handle large processing loads, but you can easily update your user by updating the Job with a progress value.
```jsx javascript
await api.jobs.update(jobId, {
progress: 50,
estimatedCompletionAt: new Date("Tue Aug 23 2023 16:19:42 GMT-0700"),
});
```
`Progress` is a numerical or percentage value indicating the completion status of the work.
You may also provide an `estimatedCompletionAt` value which will display your estimate of the remaining processing time in the foreground Job overlay.
Additionally, the Jobs Panel will share visibility into the estimated remaining time for acknowledged jobs.
Finally, when your Job is complete, update the Job's status to `complete`. This status update includes an optional outcome message which will be displayed to the user.
Use this to provide detail on the outcome of the Job.
### Jobs.Complete
Once a job is complete, you can display an alert to the end user using `outcome`.
```jsx javascript
await api.jobs.complete(jobId, {
outcome: {
message: `Operation was completed successfully. ${myData.length} records were processed.`,
acknowledge: true,
},
});
```
#### Outcome.Next
Optionally, you can add a button to the outcome dialog to control the next step in the workflow once the Job has finished.
##### Internal Link
Add a button to the dialog that will redirect the user somewhere within a Space using `next > Id`.
In the code below, we will create a button that says "See all downloads" with this path: `space/us_sp_1234/files?mode=export`
```jsx javascript
await api.jobs.complete(jobId, {
outcome: {
message: `Operation was completed successfully. ${myData.length} records were processed.`,
acknowledge: true,
//Reference: https://platform.flatfile.com/space/{$id}/{path}?{$query}
next: {
type: "id",
id: "us_sp_1234",
path: "files",
query: "mode=export",
label: "See all downloads",
},
},
});
```
##### External Url
Add a button to the dialog that will redirect the user to an external link using `next > Url`.
In the code below, we will create a button that says "Go to Google". It will open in a new tab.
```jsx javascript
await api.jobs.complete(jobId, {
outcome: {
message: `Operation was completed successfully. ${myData.length} records were processed.`,
acknowledge: true,
next: {
type: "url",
url: "http://www.google.com",
label: "Go to Google",
},
},
});
```
##### Download
Add a button to the dialog that lets the user download a file using `next > Download`.
In the code below, we will create a button that says "Download this file".
```jsx javascript
await api.jobs.complete(jobId, {
outcome: {
message: `Operation was completed successfully. ${myData.length} records were processed.`,
acknowledge: true,
next: {
type: "download",
fileName: "DownloadedFromFlatfile.csv",
url: "source_of_file.csv",
label: "Download this file",
},
},
});
```
##### Snapshot
Add a button to the dialog that will redirect the user to a particular snapshot using `next > Snapshot`.
In the code below, we will create a button that says "Go to Snapshot".
```jsx javascript
await api.jobs.complete(jobId, {
outcome: {
message: `Operation was completed successfully.`,
acknowledge: true,
next: {
type: "snapshot",
label: "Go to Snapshot",
snapshotId: snapshot.id,
sheetId: sheet.id
},
},
});
```
##### Files
Download files hosted on Flatfile by using `next > files`.
In the code below, we will create a button that says "Download files".
```jsx javascript
await api.jobs.complete(jobId, {
outcome: {
message: `The files should download automatically`,
next: {
type: "files",
label: "Download files",
files: [{fileId: "us_fl_123"}],
},
},
});
```
##### View
Dynamically update the columns visible in a sheet by using `next > view`.
In the code below, we will create a button that says "Hide columns".
```jsx javascript
await api.jobs.complete(jobId, {
outcome: {
message: `Operation was completed on sheet ${sheet.name}`,
acknowledge: true,
next: {
type: "view",
label: "Hide columns",
sheetId: sheet.id,
hiddenColumns: ["age", "phone", "middleName"], // a list of field keys on the sheet
},
},
});
```
##### Retry
Often in the event of a failure, you may want to add a button to the dialog that will retry the Job using `next > Retry`.
In the code below, we will create a button that says "Try again".
```jsx javascript
await api.jobs.fail(jobId, {
outcome: {
message: `Operation was not completed successfully. No records were processed.`,
acknowledge: true,
next: {
type: "retry",
label: "Try again",
},
},
});
```
# Listeners
Source: https://flatfile.com/docs/learning-center/concepts/listeners
The anatomy of Listeners
## What is a Listener?
Listeners are the core of the Flatfile Platform. All of your configurations,
from [data transformations](https://flatfile.com/plugins/category/transform), to
[custom styling](/learning-center/guides/theming), to [data export](/learning-center/guides/egress) are done in
a Listener. They hold the logic backing your Flatfile Spaces, defining the
functionality of your implementation.
Fundamentally, a Listener is a function that listens for
[Events](/learning-center/concepts/events) and takes action when they occur. This means
your configurations are implemented in response to events. To understand how to
configure a Listener, it's important to understand the anatomy of an Event.
## An Event-Driven System
Flatfile is an event-driven system, meaning interactions with Flatfile (clicking
a button, editing a record, uploading a file, etc) trigger the publication of an
[Event](/learning-center/concepts/events).
Each Event contains information about its source and execution context including
a domain, topic, context, and optionally, a payload.
| Component | Description | Example |
| ------------------ | ------------------------------------------------- | ------------------------------------------------------------------ |
| Domain | the jurisdiction of the event | record, sheet, workbook, space, file, job, agent |
| Topic | a combination of a domain and an action | `workbook:created` `workbook:updated` `workbook:deleted` |
| Context | detailed information about the context of the event | `{ accountId, environmentId, spaceId, actorId }` |
## Anatomy of a Listener
A Listener is a function designed to receive Events as they occur and take
action by executing a callback function. For example, when a file is uploaded, a
Listener can [extract](https://flatfile.com/plugins/category/extractors) its
data to a target sheet, or when a record is created, a Listener can
[validate](/learning-center/guides/handling-data) the data.
The callback function can be as simple as logging the event or as advanced as
integrating with external systems. Here is an example that logs the topic of any
incoming event:
```jsx javascript
export default function (listener) {
listener.on("**", (event) => {
console.log(`Received event: ${event.topic}`);
});
}
```
```jsx typescript
import { FlatfileEvent, FlatfileListener } from "@flatfile/listener";
export default function flatfileEventListener(listener: FlatfileListener) {
listener.on("**", (event: FlatfileEvent) => {
console.log(`Received event: ${event.topic}`);
});
}
```
This function is listening on "\*\*", which is a wildcard that matches all
events. For a complete list of events that can be listened to, see
[Event Topics](/learning-center/concepts/events#event-topics).
### Filtering
Listeners may be notified of every event on your Environment, but the Events
that a Listener responds to are configurable. You can target which events your
code responds to in a few ways.
One way is to use a filter. A filter is a string that matches the topic of an
event. For example, the filter "commit:created" will match any event of that
topic.
A simple way to filter events is to use the shorthand syntax `listener.on`:
```jsx javascript
listener.on("commit:created", { sheet: "contacts" }, async (event) => {
// Custom action responding to the commit
});
```
```jsx typescript
listener.on(
"commit:created",
{ sheet: "contacts" },
async (event: FlatfileEvent) => {
// Custom action responding to the commit
}
);
```
This is equivalent to:
```jsx javascript
listener.filter({ sheet: "contacts" }).on("commit:created", async (event) => {
// Custom action responding to the commit
});
```
```jsx typescript
listener
.filter({ sheet: "contacts" })
.on("commit:created", async (event: FlatfileEvent) => {
// Custom action responding to the commit
});
```
For cases when you want to use multiple listeners under one filter, this syntax
may be useful:
```jsx javascript
export default function flatfileEventListener(listener) {
listener.filter({ sheet: "contacts" }, (configure) => {
configure.on("commit:created", async (event) => {
// Custom action responding to the commit
})
})
}
```
```jsx typescript
export default function flatfileEventListener(listener: FlatfileListener) {
listener.filter({ sheet: "contacts" }, (configure) => {
configure.on("commit:created", async (event) => {
// Custom action responding to the commit
})
})
}
```
### Namespaces
Namespaces are another way to organize Listener configurations and target
specific Events. If you have two different workflows, for example, you can
assign them to different namespaces to keep them separate.
To configure a listener to only act on events for a given namespace, simply use
the `listener.namespace` method.
```jsx javascript
listener.namespace(['space:green'], (listener) => { ... })
listener.namespace(['space:red'], (listener) => { ... })
```
```jsx typescript
listener.namespace(['space:green'], (listener: FlatfileListener) => { ... })
listener.namespace(['space:red'], (listener: FlatfileListener) => { ... })
```
This will scope the listener to only respond to events that match the namespace.
Learn more about namespaces in the [Namespaces](/learning-center/guides/namespaces) guide.
### Listener Types
Listeners can be hosted everywhere from your local machine to Flatfile's secure
cloud. The location of your listener determines the type of listener it is.
| Listener | Location | Description |
| ----------- | --------------------- | ------------------------------------ |
| development | developer machine | local listener for development |
| client-side | user machine | browser embedded listener |
| Agent | Flatfile secure cloud | Flatfile hosted server-side listener |
| server-side | your secure server | self-hosted server-side listener |
Flatfile Listeners can be either client-side or server-side.
Client-side listeners run in the end user's web browser and respond to user
interactions. Client-side listeners are perfect for scenarios that need to tap
into values available in the browser, such as the current user's session or the
current page URL.
Server-side listeners run where deployed, either on your own servers or on
Flatfile's secure cloud, and can perform more complex tasks like running
intensive calculations or using secured credentials to integrate with external
systems.
Server-side listeners can leverage packages that need the capabilities of a
server environment.
## Agent
In Flatfile, an Agent refers to a server-side listener bundled and deployed to
Flatfile to be hosted in our secure cloud. You may deploy multiple Agents to a
single Environment, each with its own configurations and codebase. But be
careful, as multiple Agents can interfere with each other if not properly
managed.
### Managing Multiple Agents
With the capability to deploy multiple Agents within the same Environment, it is
crucial to manage them carefully to avoid race conditions and conflicts. When
multiple Agents are listening to the same event topics, they can interfere with
each other, leading to unpredictable outcomes.
### Strategies for Conflict Avoidance
1. **Event Filtering:** Ensure each Agent is configured to listen for specific
events or topics. Properly defining the scope of events an Agent should
handle can prevent overlap and reduce the chance of race conditions.
2. **Use of Namespaces:** Utilize namespaces to segment the operational scope of
each Agent. This method ensures that Agents only respond to Events within
their designated namespaces, effectively isolating their operations and
minimizing conflict.
3. **Unique Topic Handling:** Design Agents so that no two Agents act on the
same event topic without coordination. Establish protocols for event handling
that specify which Agent should handle particular events or topics.
### Agent Configuration
When deploying an Agent, the CLI will automatically reduce the topic scope your
Agent listens to by examining your listener code and registering itself to
listen to only the topics your listener is configured to act on.
For example, if your listener code only listens to `commit:created` events, the
CLI will automatically configure the Agent to only listen to `commit:created`
events. If your listener has a wildcard listener `**`, the CLI will configure
the Agent to listen to all events.
### Agent Deployment
To deploy an Agent, run the following command in your terminal from the root
directory containing your entry file:
```bash
npx flatfile@latest deploy
```
Please note that the `-s` flag can be used to give your Agent a unique slug. This
slug is useful when managing multiple Agents in the same Environment.
If no slug is provided and you already have a single Agent deployed, your Agent
will be updated and given a slug of `default`.
If more than one Agent is deployed to the same environment and no slug is
provided the CLI will prompt you to select which Agent you would like to update.
Please see [Deploying to Flatfile](/documentation/core-libraries/cli/deploy) for more
information on deploying an Agent.
## Agent Logs
When using Flatfile's secure cloud to host your listener, you can view the
executions of your Agent in the "Event Logs" tab of your dashboard. **Note:** If
you are running your Agent locally using the `develop` command to test, these
logs will not be recorded in your dashboard.
Event logs are useful in monitoring and troubleshooting your listener. Each
execution of your agent is recorded here, including any custom console logs you
have configured in your listener code.
### Failures
Failures occur when an Agent fails to execute properly. That means either an
uncaught error is thrown, or the execution times out.
If you are catching and handling errors within your code, those executions will
not be marked as failures. If you would prefer to see them marked this way,
re-throw your error after handling to bubble it up to the execution handler.
```jsx javascript
export default function (listener) {
listener.on("**", (event) => {
try {
// do something
} catch (error) {
// handle error
throw error; // re-throw error to mark execution as failure
}
});
}
```
```jsx typescript
export default function (listener) {
listener.on("**", (event: FlatfileEvent) => {
try {
// do something
} catch (error) {
// handle error
throw error; // re-throw error to mark execution as failure
}
});
}
```
## Learn More
Learn more about the Flatfile Platform in our other concept guides:
* [Events](/learning-center/concepts/events): the building blocks that power Flatfile's event-driven system
* [Actions](/learning-center/concepts/actions): custom code-based operations prompted by user interactions
* [Jobs](/learning-center/concepts/jobs): handle complex or long-running processes asynchronously
* [Spaces](/learning-center/architecture/spaces): micro-applications, each with its own database, filestore, and auth
# Mapping
Source: https://flatfile.com/docs/learning-center/concepts/mapping
Mapping
In Flatfile, *mapping* describes the process of getting data from an uploaded
file into a destination worksheet in your application.
## The Ghost of Mapping Past
Historically, mapping in Flatfile has consisted of two pieces:
1. A *field mapping* which describes which columns in the source data
correspond to which fields in the destination worksheet. For example, maybe
the "fname" column in the source data should be mapped to the "First Name"
field in the destination worksheet.
2. For every destination field of type [enum](/learning-center/blueprint/field-types#enum), an
*enum mapping* describing which values in the source column correspond to
which allowable enum values in the destination column. For example, you could
have an enum column of country codes, and you might need the source value
"United States" to map to the enum value "US".
Flatfile has developed machine-learning models trained on millions of historical
mappings that can suggest both field and enum mappings based on the column names
and values in the source data. You can of course override these suggestions. The
Flatfile platform remembers your previously chosen mappings and will suggest
them when it sees similar schemas / data in the future.
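To make the two pieces concrete, here is a toy sketch of applying a field mapping and an enum mapping to a single row, using the examples from the text. This mirrors the idea only; it is not Flatfile's actual data structures or SDK.

```javascript
// Field mapping: source column -> destination field (from the examples above).
const fieldMapping = { fname: "First Name", country: "Country" };
// Enum mapping per destination field: source value -> allowed enum value.
const enumMappings = { Country: { "United States": "US" } };

function mapRow(row) {
  const out = {};
  for (const [src, dest] of Object.entries(fieldMapping)) {
    const raw = row[src];
    // Translate enum values where a mapping exists; pass other values through.
    out[dest] = enumMappings[dest]?.[raw] ?? raw;
  }
  return out;
}

const mapped = mapRow({ fname: "Ada", country: "United States" });
// mapped is { "First Name": "Ada", "Country": "US" }
```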
## The Ghost of Mapping Present
Many users of Flatfile need more control over the mapping process than just
assigning source fields to destination fields. For example, you might need to
concatenate multiple source fields into a single destination field, or you might
need to choose the first non-empty value from a set of source fields.
To this end, we've recently introduced a wider variety of
[mapping rules](https://github.com/FlatFilers/flatfile-mapping/blob/main/README.md#mapping-rules)
that specify these more complex mapping-related transformations on your data,
and the notion of a *mapping program* that selectively applies these rules to
your dataset.
As of right now the only rule type supported by the Flatfile application is the
`assign` rule, as described above.
However, we have introduced a
[Python SDK](https://pypi.org/project/flatfile-mapping/) that
allows you to create your own mapping rules and mapping programs and apply them
to your own datasets outside of Flatfile (e.g. in an ETL pipeline). The SDK is
currently in extreme beta, but we would love to hear your feedback on it. It
supports the full complement of mapping rules, as described on its website.
If you have a Flatfile secret key, the SDK also exposes an "automapping" feature
that uses the previously-described machine learning models to suggest mapping
programs for a given source and destination schema. As with the Flatfile app,
this feature currently only provides `assign` rules (and also `ignore` rules
that don't do anything but just communicate that a source field was not mapped).
## The Ghost of Mapping Future
In the near future we will be expanding mapping in several ways:
### Richer Mapping Rules in the Flatfile application
We are working on adding support for the full set of mapping rules in the
Flatfile application. This will allow a user mapping an incoming file to create
complex mapping programs, save them, and reuse them.
### Richer AI Assist for Mapping Rules
Currently our machine learning algorithms only suggest "assign" rules. In the
future they will also be able to suggest more complex mapping rules; for
instance, if your incoming data had a "first name" and "last name" field, and
your destination sheet only a "full name" field, the AI might suggest a
"concatenate" rule that combines the two source fields into the destination
field.
In addition, we are working on a natural-language interface for creating mapping
rules; for instance, you could type "concatenate first name and last name into
full name" and the AI would produce the appropriate rule.
### JavaScript SDK
We are working on a JavaScript SDK that has similar functionality to the Python
SDK, the most notable difference being that it can't interact with Pandas
dataframes. It will be available soon.
# Using actions
Source: https://flatfile.com/docs/learning-center/guides/actions
Trigger operations based on user input.
An Action is a code-based operation that runs where that Action is mounted. Actions run when a user clicks the corresponding user prompt in Flatfile.
## Overview
Workbook & Sheet-mounted Actions are configured within a Blueprint object, while File-mounted actions are appended to the file during the upload process.
The executable code within an Action is compiled into a Job entity, providing the capability to run asynchronously or immediately.
Sheet-mounted actions can be executed on the entire Sheet, for a filtered view of the Sheet, or selectively for the chosen records.
## Workbook-mounted
Once a user has extracted and mapped data into a [Workbook](/learning-center/architecture/workbooks), it may be more efficient to run an operation on the entire dataset rather than making atomic transformations at the record- or field-level.
For example:
* Sending a webhook that notifies your API of the data's readiness
* Populating a Sheet with data from another source
* Adding two different fields together after a user reviews initial validation checks
* Moving valid data from an editable Sheet to a read-only Sheet
Workbook-mounted actions are represented as buttons in the top right of the Workbook.

### Usage
First, configure your action on your Blueprint.
If you configure `primary: true` on an Action, it will be represented as the
rightmost button in the Workbook.
If you configure `trackChanges: true`, it will disable your actions
until all commits are complete (usually data hooks).
Next, listen for a `job:ready` and filter on the `job` you'd like to process. Be sure to complete the job when it's complete.
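As a minimal sketch of how the two pieces line up (the operation and labels are illustrative), the Action definition determines the name of the Job your listener filters on; a Workbook-mounted Action publishes a `job:ready` Event for a Job named `workbook:[operation]`:

```javascript
// A Workbook-mounted Action as it might appear in a Blueprint.
const exportAction = {
  operation: "export",
  label: "Export Data",
  description: "Send data to destination system",
  primary: true, // rendered as the rightmost button in the Workbook
  trackChanges: true, // disabled until all commits are complete
};

// When a user clicks the button, a job:ready Event fires for a Job named
// `[domain]:[operation]` -- filter on this name in your listener.
const jobName = `workbook:${exportAction.operation}`;
// jobName is "workbook:export"
```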
## Document-mounted
Document-mounted actions are similar to Workbook-mounted actions. They appear in the top right corner of your Document:

Read more about Documents [here](/learning-center/guides/documents).
### Usage
Define Document-mounted Actions using the `actions` parameter when you create a Document.
If you configure `primary: true` on an Action, it will be represented as the
rightmost button in the Document.
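For example, the request body for a Document with a mounted Action might look like the sketch below (the title, body, and operation name are assumptions for illustration):

```js
// Sketch: request body for creating a Document with an Action (names are illustrative)
const documentRequest = {
  title: "Contacts",
  body: "# Welcome ...",
  actions: [
    {
      operation: "submit", // paired with the Document, e.g. "document:contacts:submit"
      mode: "foreground",
      label: "Submit",
      primary: true, // rendered as the rightmost button in the Document
    },
  ],
};
// await api.documents.create(spaceId, documentRequest);
```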
In your listener, listen for the job's event and perform your desired operations.
```js listener.js
export default function flatfileEventListener(listener) {
listener.on(
"job:ready",
{ job: "document:contacts:submit" },
async (event) => {
const { context, payload } = event;
const { jobId, workbookId } = context;
try {
await api.jobs.ack(jobId, {
info: "Starting submit job...",
// "progress" value must be a whole integer
progress: 10,
estimatedCompletionAt: new Date("Tue Aug 23 2023 16:19:42 GMT-0700"),
});
// Do your work here
await api.jobs.complete(jobId, {
outcome: {
message: `Submit job was completed successfully.`,
},
});
} catch (error) {
console.log(`There was an error: ${JSON.stringify(error)}`);
await api.jobs.fail(jobId, {
outcome: {
message: `This job failed.`,
},
});
}
}
);
}
```
```ts listener.ts
import { FlatfileListener, FlatfileEvent } from "@flatfile/listener";
export default function flatfileEventListener(listener: FlatfileListener) {
listener.on(
"job:ready",
{ job: "document:contacts:submit" },
async (event: FlatfileEvent) => {
const { context, payload } = event;
const { jobId, workbookId } = context;
try {
await api.jobs.ack(jobId, {
info: "Starting submit job...",
// "progress" value must be a whole integer
progress: 10,
estimatedCompletionAt: new Date("Tue Aug 23 2023 16:19:42 GMT-0700"),
});
// Do your work here
await api.jobs.complete(jobId, {
outcome: {
message: `Submit job was completed successfully.`,
},
});
} catch (error) {
console.log(`There was an error: ${JSON.stringify(error)}`);
await api.jobs.fail(jobId, {
outcome: {
message: `This job failed.`,
},
});
}
}
);
}
```
## Sheet-mounted
Each Sheet has built-in Actions like download.
Sheet-mounted actions are represented as a dropdown in the toolbar of the Sheet.

### Usage
First, configure your action on your Blueprint.
```jsx sheet.js
sheets : [
{
name: "Sheet Name",
actions: [
{
operation: 'duplicate',
mode: 'background',
label: 'Duplicate selected names',
description: 'Duplicate names for selected rows',
primary: true,
},
{...}
]
}
]
```
Next, listen for a `job:ready` event and filter on the `domain` (sheet) and the
`operation` where the action was placed. Be sure to complete the Job once your
operation finishes.
```jsx listener.js
listener.on(
"job:ready",
{ job: "sheet:duplicate" },
async ({ context: { jobId } }) => {
try {
await api.jobs.ack(jobId, {
info: "Getting started.",
// "progress" value must be a whole integer
progress: 10,
estimatedCompletionAt: new Date("Tue Aug 23 2023 16:19:42 GMT-0700"),
});
// Do your work here
await api.jobs.complete(jobId, {
info: "This job is now complete.",
});
} catch (error) {
console.error("Error:", error.stack);
await api.jobs.fail(jobId, {
info: "This job did not work.",
});
}
}
);
```
```ts listener.ts
listener.on(
"job:ready",
{ job: "sheet:duplicate" },
async ({ context: { jobId } }: FlatfileEvent) => {
try {
await api.jobs.ack(jobId, {
info: "Getting started.",
// "progress" value must be a whole integer
progress: 10,
estimatedCompletionAt: new Date("Tue Aug 23 2023 16:19:42 GMT-0700"),
});
// Do your work here
await api.jobs.complete(jobId, {
info: "This job is now complete.",
});
} catch (error) {
console.error("Error:", error.stack);
await api.jobs.fail(jobId, {
info: "This job did not work.",
});
}
}
);
```
Get the example in the [getting-started](https://github.com/FlatFilers/flatfile-docs-kitchen-sink/tree/main/typescript/actions) repo
### Retrieving data
Data from the Sheet can be retrieved either by calling the API with `records.get` or from the data passed in via `event.data`. Here are some examples demonstrating how you can extract data from a Sheet-mounted action:
#### From the entire Sheet
This method allows you to access and process data from the complete Sheet, regardless of the current view or selected records.
```jsx
//inside listener.on()
const { jobId, sheetId } = event.context;
//retrieve all records from sheet
const response = await api.records.get(sheetId);
//print records
console.log(response.data.records);
```
#### From a filtered view of the Sheet
By applying filters to the Sheet, you can narrow down the data based on specific criteria. This enables you to retrieve and work with a subset of records that meet the defined filter conditions.
`event.data` returns a promise that resolves to an object with a `records` property, so we await it and destructure `records` directly.
If rows are selected, only the corresponding records will be passed through
the event for further processing.
```jsx
//inside listener.on()
const { jobId } = event.context;
const { records } = await event.data;
try {
if (!records || records.length === 0) {
console.log("No rows were selected or in view.");
await api.jobs.fail(jobId, {
outcome: {
message: "No rows were selected or in view, please try again.",
},
});
return;
}
//print records
console.log(records);
await api.jobs.complete(jobId, {
outcome: {
message: "Records were printed to console, check it out.",
},
});
} catch (error) {
console.log(`Error: ${JSON.stringify(error, null, 2)}`);
await api.jobs.fail(jobId, {
outcome: {
message: "This action failed, check logs.",
},
});
}
```
#### For chosen records
When rows are selected, `event.data` will only extract information exclusively for the chosen records, providing focused data retrieval for targeted analysis or operations.
`event.data` returns a promise that resolves to an object with a `records` property, so we await it and destructure `records` directly.
This code is the same as the filtered view of the Sheet.
```jsx
//inside listener.on()
const { jobId } = event.context;
const { records } = await event.data;
try {
if (!records || records.length === 0) {
console.log("No rows were selected or in view.");
await api.jobs.fail(jobId, {
outcome: {
message: "No rows were selected or in view, please try again.",
},
});
return;
}
//print records
console.log(records);
await api.jobs.complete(jobId, {
outcome: {
message: "Records were printed to console, check it out.",
},
});
} catch (error) {
console.log(`Error: ${JSON.stringify(error, null, 2)}`);
await api.jobs.fail(jobId, {
outcome: {
message: "This action failed, check logs.",
},
});
}
```
## File-mounted
Each file has built-in actions like Delete and Download.
File-mounted actions are represented as a dropdown menu for each file in the Files list. You can attach additional actions to a File.

### Usage
First, listen for a `file:created` event and add one or more actions to the file.
```jsx listener.js file:created
listener.on("file:created", async ({ context: { fileId } }) => {
const file = await api.files.get(fileId);
const actions = file.data?.actions || [];
const newActions = [
...actions,
{
operation: "logFileContents",
label: "Log File Metadata",
description: "This will log the file metadata.",
},
{
operation: "decryptAction",
label: "Decrypt File",
description: "This will create a new decrypted file.",
},
];
await api.files.update(fileId, {
actions: newActions,
});
});
```
```ts listener.ts file:created
listener.on("file:created", async ({ context: { fileId } }: FlatfileEvent) => {
const file = await api.files.get(fileId);
const actions = file.data?.actions || [];
const newActions = [
...actions,
{
operation: "logFileContents",
label: "Log File Metadata",
description: "This will log the file metadata.",
},
{
operation: "decryptAction",
label: "Decrypt File",
description: "This will create a new decrypted file.",
},
];
await api.files.update(fileId, {
actions: newActions,
});
});
```
Next, listen for `job:ready` and filter on the `domain` (file) and the `operation` where the Action was placed. Be sure to complete the Job once your operation finishes.
```js listener.js job:ready
//when the button is clicked in the file dropdown
listener.on(
"job:ready",
{ job: "file:logFileContents" },
async ({ context: { fileId, jobId } }) => {
await acknowledgeJob(jobId, "Getting started.", 10);
const file = await api.files.get(fileId);
console.log({ file });
await completeJob(jobId, "Logging file contents is complete.");
}
);
listener.on(
"job:ready",
{ job: "file:decryptAction" },
async ({ context: { spaceId, fileId, jobId, environmentId } }) => {
try {
await acknowledgeJob(jobId, "Getting started.", 10);
const fileResponse = await api.files.get(fileId);
const buffer = await getFileBufferFromApi(fileId);
const { name, ext } = fileResponse.data;
const newFileName = name
? name.split(".")[0] + "[Decrypted]." + ext
: "DecryptedFile.csv";
const formData = new FormData();
formData.append("file", buffer, { filename: newFileName });
formData.append("spaceId", spaceId);
formData.append("environmentId", environmentId);
await uploadDecryptedFile(formData);
await completeJob(jobId, "Decrypting is now complete.");
} catch (e) {
await failJob(jobId, "The decryption job failed.");
}
}
);
async function acknowledgeJob(jobId, info, progress, estimatedCompletionAt) {
await api.jobs.ack(jobId, {
info,
progress,
estimatedCompletionAt
});
}
async function completeJob(jobId, message) {
await api.jobs.complete(jobId, {
outcome: {
message,
},
});
}
async function failJob(jobId, message) {
await api.jobs.fail(jobId, {
outcome: {
message,
},
});
}
```
```ts listener.ts job:ready
//when the button is clicked in the file dropdown
listener.on(
"job:ready",
{ job: "file:logFileContents" },
async ({ context: { fileId, jobId } }: FlatfileEvent) => {
await acknowledgeJob(jobId, "Getting started.", 10);
const file = await api.files.get(fileId);
console.log({ file });
await completeJob(jobId, "Logging file contents is complete.");
}
);
listener.on(
"job:ready",
{ job: "file:decryptAction" },
async ({
context: { spaceId, fileId, jobId, environmentId },
}: FlatfileEvent) => {
try {
await acknowledgeJob(jobId, "Getting started.", 10);
const fileResponse = await api.files.get(fileId);
const buffer = await getFileBufferFromApi(fileId);
const { name, ext } = fileResponse.data;
const newFileName = name
? name.split(".")[0] + "[Decrypted]." + ext
: "DecryptedFile.csv";
const formData = new FormData();
formData.append("file", buffer, { filename: newFileName });
formData.append("spaceId", spaceId);
formData.append("environmentId", environmentId);
await uploadDecryptedFile(formData);
await completeJob(jobId, "Decrypting is now complete.");
} catch (e) {
await failJob(jobId, "The decryption job failed.");
}
}
);
async function acknowledgeJob(
jobId: string,
info: string,
progress: number,
estimatedCompletionAt: Date | undefined = undefined
) {
await api.jobs.ack(jobId, {
info,
progress,
estimatedCompletionAt,
});
}
async function completeJob(jobId: string, message: string) {
await api.jobs.complete(jobId, {
outcome: {
message,
},
});
}
async function failJob(jobId: string, message: string) {
await api.jobs.fail(jobId, {
outcome: {
message,
},
});
}
```
Get the example in the [flatfile-docs-kitchen-sink ts](https://github.com/FlatFilers/flatfile-docs-kitchen-sink/blob/main/typescript/shared/workbook_submit.ts) or [flatfile-docs-kitchen-sink js](https://github.com/FlatFilers/flatfile-docs-kitchen-sink/blob/main/javascript/shared/workbook_submit.js) repo
## Actions with Input Forms
If you configure input fields for your action, a secondary dialog will be presented to the end user, prompting them to provide the necessary information. Once the user has entered the required details, they can proceed with the action.
### Usage
First, configure your action to have an `inputForm` on your Blueprint. This form will appear once the action button is clicked.
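For example, an Action with a simple `inputForm` might be configured like the sketch below (the field keys, labels, and types are assumptions for illustration; see the API reference for the full set of supported field types):

```js
// Sketch: an Action whose button opens an input dialog (names are illustrative)
const actionWithInput = {
  operation: "actionWithInput", // matches the "workbook:actionWithInput" job filter below
  mode: "foreground",
  label: "Submit with options",
  primary: true,
  inputForm: {
    type: "simple",
    fields: [
      { key: "priority", label: "Priority", type: "string" },
      { key: "sendNotifications", label: "Send notifications?", type: "boolean" },
    ],
  },
};
```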
Next, listen for a `job:ready` and filter on the `job` you'd like to process. Grab the data entered in the form from the job itself and leverage it as required for your use case.
```js listener.js
import api from "@flatfile/api";
export default async function (listener) {
listener.on(
"job:ready",
{ job: "workbook:actionWithInput" },
async (event) => {
const { jobId } = event.context;
try {
await api.jobs.ack(jobId, {
info: "Acknowledging job",
progress: 1,
});
// retrieve input
const { data: job } = await api.jobs.get(jobId);
const input = job.input;
console.log({ input });
// do something with input...
await api.jobs.complete(jobId, {
outcome: {
message: "Action was successful",
},
});
return;
} catch (error) {
console.error(error);
await api.jobs.fail(jobId, {
outcome: {
message: "Action failed",
},
});
return;
}
}
);
}
```
```ts listener.ts
import api from "@flatfile/api";
import type { FlatfileEvent, FlatfileListener } from "@flatfile/listener";
export default async function (listener: FlatfileListener) {
listener.on(
"job:ready",
{ job: "workbook:actionWithInput" },
async (event: FlatfileEvent) => {
const { jobId } = event.context;
try {
await api.jobs.ack(jobId, {
info: "Acknowledging job",
progress: 1,
});
// retrieve input
const { data: job } = await api.jobs.get(jobId);
const input = job.input;
console.log({ input });
// do something with input...
await api.jobs.complete(jobId, {
outcome: {
message: "Action was successful",
},
});
return;
} catch (error) {
console.error(error);
await api.jobs.fail(jobId, {
outcome: {
message: "Action failed",
},
});
return;
}
}
);
}
```
## Actions with Constraints
### Usage
#### Workbook & Sheet-mounted
1. Adding a `hasAllValid` constraint to an Action will disable the Action when
there are invalid records.
2. Adding a `hasData` constraint to an Action will disable the Action when there
are no records.
```jsx
actions: [
{
operation: 'submitActionFg',
mode: 'foreground',
label: 'Submit data elsewhere',
description: 'Submit this data to a webhook.',
primary: true,
constraints: [{ type: 'hasAllValid'},{ type: 'hasData' }]
},
{...}
],
```
#### Sheet-Mounted only
Adding a constraint of `hasSelection` on an Action will disable a Sheet Action when no records in the Sheet are selected.
```jsx sheet.js
sheets : [
{
name: "Sheet Name",
actions: [
{
operation: 'duplicate',
mode: 'background',
label: 'Duplicate selected names',
description: 'Duplicate names for selected rows',
constraints: [{ type: 'hasSelection' }],
primary: true,
},
{...}
]
}
]
```
## Actions with Messages
Add custom messages to actions, tailored according to their state:
* Error
* Info
These messages will be displayed as tooltips when users hover over an action, providing context-specific text that corresponds to the action's current state. When an error message is present on an action, the action will be disabled.
### Usage
Add a `messages` property to your action configuration. This property should be an array of objects, each specifying a message type and its content.
```jsx
actions: [
{
operation: 'duplicate',
mode: 'background',
label: 'Duplicate selected names',
description: 'Duplicate names for selected rows',
messages: [
{ type: 'error', content: 'This is an error message' },
],
primary: true,
},
{...}
]
```
## Example Project
Find the Actions example in the Flatfile GitHub repository.
Clone the Actions example in Typescript
Clone the Actions example in Javascript
# Accepting Additional fields
Source: https://flatfile.com/docs/learning-center/guides/additional-fields
create additional fields on the fly
The `allowAdditionalFields` feature offers a fluid integration experience,
allowing users to effortlessly map to new or unconfigured fields in your
Blueprints.
## How it works
* By enabling `allowAdditionalFields`, your Sheet isn't restricted to the
initial configuration. It can adapt to include new fields, whether they're
anticipated or not.
* These supplementary fields can either be added through API calls or input
directly by users during the file import process.
* To ensure clarity, any field that wasn't part of the original Blueprint
configuration is flagged with a `treatment` property labeled `user-defined`.
* When adding a custom field, there's no need to fuss over naming the field. The
system intuitively adopts the header name from the imported file, streamlining
the process.
In essence, the `allowAdditionalFields` feature is designed for scalability and
ease, ensuring your Blueprints are always ready for unexpected data fields.
## Example Blueprint w/ `allowAdditionalFields`
```json
{
"sheets": [
{
"name": "Contacts",
"slug": "contacts",
"allowAdditionalFields": true,
"fields": [
{
"key": "firstName",
"label": "First Name",
"type": "string"
},
{
"key": "lastName",
"label": "Last Name",
"type": "string"
},
{
"key": "email",
"label": "Email",
"type": "string"
}
]
}
]
}
```
# Leverage AI transformations
Source: https://flatfile.com/docs/learning-center/guides/ai
use AI Assist to make bulk edits to your data efficiently and accurately
Flatfile's AI Assist enables you to transform and clean your data quickly and
effectively using natural language commands. This no-code approach allows you to
easily transform your data in two simple steps:
1. Start by selecting the data you want to transform within the Flatfile
platform.
2. Use AI Assist to describe the transformations you need, and watch as your
data is instantly updated.
## AI Assist Capabilities
### Combine Data
Combine data from multiple columns into a single value. For instance, use first
name, last name, and domain name to create an email address.
Using AI Assist to create email addresses from names and domain:
```plaintext
Combine first name, last name, and domain name to create email addresses.
```
### Calculate Values
Generate new values by performing calculations on existing data. For example,
multiply property value by tax rate to calculate property taxes.
Using AI Assist to calculate property taxes:
```plaintext
Multiply property value by tax rate to calculate property taxes.
```
### Extract Data
Extract specific parts of a string to create new columns. For example, divide a
full name field into first, middle, and last names.
Using AI Assist to split full names:
```plaintext
Split full name into first name, middle name, and last name.
```
### Move and Modify Data
Move data from one column to another, change data formatting, delete values, and
more.
Using AI Assist to reformat dates:
```plaintext
Change date format from MM/DD/YYYY to YYYY-MM-DD.
```
### Language Support
Ask AI Assist to transform your data in multiple languages.
Using AI Assist in your language of choice:
```plaintext
Changez le format de date de MM/JJ/AAAA à AAAA-MM-JJ.
```
## AI Assist Features
AI Assist in data transformations offers a powerful suite of features designed
to enhance the accuracy, efficiency, and security of your data workflows. By
leveraging intelligent filtering, prompt suggestions, preview capabilities,
iterative updates, version history, and robust data privacy measures, AI Assist
ensures that your data transformations are both precise and protected. This
guide will explore each of these features in detail, demonstrating how they can
help you achieve optimal results in your data management tasks.
### Filter & Prompt
##### Use flexible filtering tools to narrow down your dataset before applying transformations.
Filter your data and apply AI Assist commands to only the selected records. This
feature allows you to target specific subsets of your data, ensuring that the
AI-driven transformations are applied precisely where needed. By selecting only
the relevant records, you can streamline the transformation process, reduce
errors, and enhance the overall efficiency of your data workflows. This targeted
approach also helps in maintaining the integrity of your data, as only the
necessary parts are altered while the rest remain untouched.
### Prompt Ideas and Tips
##### Get suggestions for prompts and tips to achieve the best results.
Get suggestions for prompts and tips to achieve the best results. AI Assist
provides intelligent recommendations for crafting effective prompts, helping
users to articulate their data transformation needs clearly. These suggestions
are tailored based on the context of your data and the desired outcome, ensuring
that you can leverage the full potential of AI-driven transformations. With
these guided prompts and tips, even users with limited experience can achieve
professional-level results, making the tool accessible and easy to use.
Use the tool's own suggestions on how to refine your prompts for better
accuracy.
### Preview Changes
##### See how your data will change based on your prompt before committing to the transformation.
Preview the changes AI Assist will make before applying them to ensure accuracy.
This feature allows you to see a detailed overview of how your data will be
transformed before any changes are committed. By previewing these modifications,
you can verify that the transformations align with your expectations and
requirements. This step acts as a safeguard against potential errors, providing
you with the confidence that the AI-driven changes will enhance your data
quality without introducing unforeseen issues.
### Iterative Prompting
##### Make adjustments to your prompts and see the updated results in real-time.
Refine your prompts and see instant updates to your data in the preview mode.
Iterative prompting enables a dynamic interaction with your data transformation
process, allowing you to make adjustments and see the results in real time. This
continuous feedback loop helps in fine-tuning the prompts to achieve the desired
outcomes more accurately. By iterating on your prompts, you can progressively
improve the transformation process, ensuring that the final results meet your
exact specifications.
### Version History
##### Restore your data to any previous state with ease.
Keep a snapshot of your data before each change, allowing you to revert to
previous versions if needed. Version history is a crucial feature that provides
a safety net for your data transformations. By maintaining a record of each
state of your data, you can easily backtrack to any previous version if the
current changes do not yield the expected results. This capability ensures that
your data is always recoverable, giving you peace of mind and flexibility in
managing your transformations.
### Data Privacy
##### Your data is not shared with third-party AI providers.
AI Assist ensures your data remains secure and private, with all processing done
within Flatfile’s infrastructure. Data privacy is a paramount concern, and AI
Assist addresses this by guaranteeing that all data processing occurs within a
secure and controlled environment. By keeping all operations within Flatfile’s
infrastructure, the risk of data breaches or unauthorized access is minimized.
This commitment to privacy ensures that your data remains confidential and
protected throughout the transformation process.
AI Assist makes bulk data editing quick, accurate, and effortless for teams.
The feature is available now for Pro and Enterprise customers. Please contact
us to enable it on your account.
# Automate with automap
Source: https://flatfile.com/docs/learning-center/guides/automap
automatically map your data fields
### Usage
A key component of Flatfile's headless automation workflow is the automapping
functionality provided by our
[Automap Plugin](https://flatfile.com/plugins/plugin-automap/).
Upon uploading a file, after file extraction is complete, the data needs to be
mapped to a destination Workbook. In a non-headless environment a human might do
this manually. But in a headless environment, we'll automate this process.
We'll use Automap to do this for us.
### Install
First, add the plugin to your project.
### Configure
Next, configure the plugin. Choose an accuracy level, a default target sheet,
and a regex expression to test incoming files.
**accuracy:** either `confident` or `exact` \
**defaultTargetSheet:** the name of the sheet you want to map to \
**matchFilename:** regex that will match on incoming files
Note you may also set `debug` to true for useful error messages in
development.
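Putting those options together, the plugin configuration might look like the sketch below (the target sheet name and filename pattern are assumptions for illustration; the plugin is imported from `@flatfile/plugin-automap`):

```js
// Sketch: options for the Automap plugin (sheet name and pattern are illustrative)
const automapOptions = {
  accuracy: "confident",               // or "exact"
  defaultTargetSheet: "Contacts",      // destination sheet to map into
  matchFilename: /^contacts.*\.csv$/i, // regex matched against incoming filenames
  debug: true,                         // verbose messages during development
};
// import { automap } from "@flatfile/plugin-automap";
// listener.use(automap(automapOptions));
```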
### Deploy and watch it work
Add your configured listener to your project and deploy it to Flatfile.
# Data Retention
Source: https://flatfile.com/docs/learning-center/guides/data-retention
implementing Data Retention Policies in Flatfile
**Curious about activating Data Retention?** Talk with one of our product
experts to discover how Flatfile's Data Retention Policy can help you control
the lifecycle of your data within Flatfile on our
**[Enterprise](https://flatfile.com/pricing/)** plan. [Connect with a product
expert.](https://flatfile.com/contact-sales/consult/)
## Introduction
Data retention policies are crucial for managing and controlling the lifecycle
of data within your Flatfile account. In this guide, we'll walk you through the
process of adding and configuring data retention policies based on our two
options `last activity` and `since created` policy types. These policies will
help you automatically clean up outdated data within your Flatfile environment,
ensuring optimal data hygiene.
Flatfile will expire Spaces (the Space will no longer be accessible but will
still appear in the Spaces' response) and delete any data that was uploaded or
added to the respective Space.
Flatfile applies a 7-day grace window, which means that your data may be deleted
+/- 7 days from your defined period.
## Prerequisites
Before you start implementing data retention policies, make sure you have:
1. A Flatfile account with administrative privileges.
2. Familiarity with your data and the desired retention periods.
## Accessing the Dashboard
1. Log in to your Flatfile account.
2. Navigate to the dashboard, where you can manage and configure your Spaces.
## Understanding Data Deletion Logic
Let's clarify the logic behind data deletion with a couple of examples:
### Since Created Policy Example:
* Space created on January 1st.
* Policy set to 2 days.
* If Flatfile's cron event runs on January 3rd, any Space created on or before
January 1st (older than 2 days) will be expired. Any data that was uploaded to
those Spaces will be deleted.
### Last Activity Policy Example:
* Space last modified on January 1st.
* Policy set to 3 days.
* If Flatfile's cron event runs on January 4th, any Space that hasn't been
modified since January 1st (older than 3 days) will be expired. Any data that
was uploaded to the Spaces will be deleted.
## Monitoring and Adjusting Policies
Regularly monitor the results of the data retention policies to ensure they
align with your data management goals. If needed, adjust the policy time periods
based on your evolving data requirements.
## Implementing via API
If you'd like to create, update, or delete your policy via the API, check out our
[API reference](https://reference.flatfile.com/)
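As a rough sketch (the method and field names below are assumptions; confirm them against the API reference before use), creating a "last activity" policy might look like:

```js
// Hypothetical sketch -- verify names against the current API reference
const policyRequest = {
  type: "lastActivity",        // or "sinceCreated"
  period: 30,                  // retention period in days
  environmentId: "us_env_123", // assumed environment ID
};
// await api.dataRetentionPolicies.create(policyRequest);
```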
# Data Clips (Beta)
Source: https://flatfile.com/docs/learning-center/guides/dataclips
isolate, edit, collaborate-on, and re-integrate subsets of Sheet data
Data Clips is a powerful, built-in feature in Flatfile that makes data collaboration and issue resolution fast, secure, and incredibly easy.
Data Clips is currently in Beta and may have some limitations and unexpected behavior.
## What are Data Clips?
Data Clips are subsets of Sheet data where users can edit and collaborate without affecting the original dataset. Data Clips were developed to enable teams to:
* Create isolated working environments for data modifications
* Enable parallel workflows across different teams
* Maintain data integrity through controlled update processes
* Share specific data segments with collaborators
* Implement robust conflict resolution mechanisms
## Creating A Data Clip
To create a Data Clip, go to the source Sheet and select which Records you would like to work on.
Use the built-in Sheet action to create the clip, giving it a title, an optional comment, and collaborators.
Adding Records to Data Clips is allowed only when the Data Clip is in an Open and Ready state.
See [Data Clip States](#reference) for more information.
## Using Data Clips
### Collaborating
Data Clips are a collaborative feature. Any guest of your Space may be invited to collaborate on a Data Clip.
Once added, collaborators can edit, leave comments, and review changes before the Data Clip is merged back into the source Sheet.
### Adding Records
Record management within Data Clips is designed to be flexible and intuitive.
Users can add records to existing clips as needed. From the source Sheet, select the Records you wish to add, then choose the `Add Records to Data Clip` Sheet action.
### Deleting Records
Users may flag records for deletion in a Data Clip. While working in the clip, these records will appear with a strikethrough to show they'll be deleted.
You can restore marked records at any time before merging. After merging, these records will be removed from the main sheet.
### Conflict Resolution
If the same record has been modified in both the Data Clip and the main sheet, a conflict
resolution interface ensures that users can make an informed decision on which version to keep.
### Merging Changes
Once changes are finalized, they can be merged back into the main sheet, ensuring controlled updates while preserving the original dataset's integrity.
When merging a Data Clip, a visual diff of all changes allows users to confirm updates before they are applied.
Only the creator of a Data Clip has the authority to merge changes back into the main sheet, maintaining control over the final updates.
## Data Clip Library
Multiple Data Clips can be created from a single sheet, enabling different teams or individuals to work in parallel without interfering with each other's workflows.
The Data Clip Library provides an organized way to find and manage Data Clips. Users can search for Data Clips by name, creator, or source Sheet ID. The library also supports filtering by status, making it easy to locate specific clips.
## Reference
| State | Description |
| ---------- | ------------------------------------------------------------------------------------ |
| **Open** | Newly created, allows record addition/removal, enables data modifications |
| **Ready** | Changes are locked, conflicts are identified, available for review/resolution |
| **Merged** | Changes are accepted, data is merged back to source, historical record is maintained |
For detailed API documentation, please refer to our [API Reference](https://reference.flatfile.com).\
To learn about the events emitted for Data Clips, refer to [Events](/learning-center/concepts/events#data-clips).
## Troubleshooting
If you encounter issues with Data Clips, here are some common problems and their solutions.
| Issue | Solution |
| ------------------------- | ---------------------------------------------------------------------------------------------- |
| Creation Failures | Check sheet permissions, and verify record selection |
| Merge Conflicts | Refresh source data, review all conflicts systematically, and resolve conflicts field by field |
| Unable to merge Data Clip | Ensure you are the creator of the Data Clip |
# Build Documents
Source: https://flatfile.com/docs/learning-center/guides/documents
Learn how to build pages in your sidebar with Documents.
Documents are standalone webpages for your Flatfile [Spaces](/learning-center/architecture/spaces). They can be rendered from [Markdown syntax](https://www.markdownguide.org/basic-syntax/).
Often used for getting started guides, Documents become extremely powerful with dynamically generated content that stays updated as Events occur.
Flatfile also allows you to use HTML tags in your Markdown-formatted text.
This is helpful if you prefer certain HTML tags rather than Markdown syntax.
## Create a Document
You can create Documents using the API:
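A minimal sketch (the title and body here are illustrative):

```js
// Sketch: request body for a Document (title and body are illustrative)
const documentBody = {
  title: "Getting Started",
  body: "# Welcome\nTo get started, upload a file.",
};
// const { data: document } = await api.documents.create(spaceId, documentBody);
```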
This Document will now appear in the sidebar of your Space.
In this example, we create a Document when a file is uploaded, but you can also create Documents in response to any other Event. [Read more](/learning-center/concepts/events) about the different Events you can respond to.
## Document Actions
Actions are optional and allow you to run custom operations in response to a user-triggered event from within a Document.
Define Actions on a Document using the `actions` parameter when a document is created:
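For example (the operation name and labels are illustrative):

```js
// Sketch: a Document created with an "actions" array (names are illustrative)
await api.documents.create(spaceId, {
  title: "Contacts",
  body: "# Contacts\n...",
  actions: [
    {
      operation: "submit",
      mode: "foreground",
      label: "Submit",
      primary: true,
    },
  ],
});
```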
Then configure your listener to handle this Action, and define what should happen in response. Read more about Actions and how to handle them [here](./actions#document-mounted).
Actions appear as buttons in the top right corner of your Document:

## Document treatments
Documents have an optional `treatments` parameter which takes an array of treatments for your Document. Treatments can be used to categorize your Document. Certain treatments will cause your Document to look or behave differently.
### Ephemeral documents
Giving your Document a treatment of `"ephemeral"` will cause the Document to appear as a full-screen takeover, and it will not appear in the sidebar of your Space like other Documents. You can use ephemeral Documents to create a more focused experience for your end users.
```ts
const ephemeralDoc = await api.documents.create(spaceId, {
title: 'Getting started',
body: '# Welcome ...',
treatments: ['ephemeral'],
})
```
Currently, `"ephemeral"` is the only treatment that will change the behavior
of your Document. Other treatments will not affect how your Document behaves,
but can be used for your own purposes to categorize and describe your
Documents in code.
## Adding Blocks to Documents
Blocks are dynamic, embedded entities that you can use to display data inside a Document. You can add a Block to a Document using the `