Data

Data Processing

Describe the transformation you need in plain English. The agent writes and runs the Python script, shell pipeline, or Node job — you just approve and get clean output.

Video demo coming soon

Impact

What changes

Without Lapu AI

A data analyst spends 20-40 minutes writing a Python script or fumbling with spreadsheet formulas to clean and transform a CSV. Sensitive data gets uploaded to online tools for quick fixes.

With Lapu AI

Lapu AI generates and runs the transformation script in minutes. The original data is preserved, the output is validated, and processing happens locally on your machine.

20-40 minutes saved per transformation

The challenge

Processing local data files often requires writing one-off scripts, switching between spreadsheet applications, or uploading sensitive data to online tools. Teams working with customer data, financial records, or proprietary datasets need a way to transform and analyze files without sending them to third-party services.



How Lapu AI solves this

Lapu AI reads your local data files (CSV, JSON, TSV, XML, YAML, logs), understands the structure, and performs transformations you describe in plain language. It writes and executes processing scripts — either via shell pipelines (awk, jq, sed) or through sandboxed Python and Node runtimes with automatic package management. The sandbox execution environment runs scripts locally with output capture and validation.

Data processing runs locally via scripts the agent generates. Only structural context (like column names) is sent to the AI model.
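As a rough sketch of the kind of script this produces, consider a request like "merge the JSON files in a folder into one CSV, keyed by id." A standard-library version might look like the following (the folder, file names, and fields here are invented for illustration, not output from Lapu AI):

```python
import csv
import glob
import json
import os

# Create two tiny JSON files standing in for a user's data/ folder.
os.makedirs("data", exist_ok=True)
for name, rec in [("a.json", {"id": 1, "name": "Ada"}),
                  ("b.json", {"id": 2, "name": "Bo"})]:
    with open(os.path.join("data", name), "w") as f:
        json.dump(rec, f)

# Merge every JSON file into one dict keyed by 'id', so duplicate
# records collapse to the last one seen.
records = {}
for path in sorted(glob.glob("data/*.json")):
    with open(path) as f:
        rec = json.load(f)
    records[rec["id"]] = rec

# Write the merged records out as a CSV.
with open("merged.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "name"])
    writer.writeheader()
    for rec in sorted(records.values(), key=lambda r: r["id"]):
        writer.writerow(rec)

print(f"merged {len(records)} records")  # → merged 2 records
```

Because the script runs on your machine, the JSON contents never leave it; only the structure of the request reaches the model.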

Workflow

How it works

1

Load your data files

Point Lapu AI at your data files. The agent reads the first rows using File Read to understand the schema, column types, and overall structure.

File Read · Shell
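A minimal sketch of what this schema peek amounts to in Python — the demo file and its columns are made up for illustration:

```python
import csv
from itertools import islice

# Demo data standing in for the user's file; the agent reads a real path.
with open("sales.csv", "w", newline="") as f:
    f.write("id,region,amount\n1,EMEA,120.50\n2,APAC,0\n3,AMER,87.00\n")

# Read only the header and the first few rows to learn the schema,
# without loading the whole file into memory.
with open("sales.csv", newline="") as f:
    reader = csv.reader(f)
    header = next(reader)              # column names
    sample = list(islice(reader, 5))   # first data rows

print("columns:", header)           # → columns: ['id', 'region', 'amount']
print("sample rows:", len(sample))  # → sample rows: 3
```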
2

Describe what you need

Tell the agent what to do in plain language — filter, merge, aggregate, reformat. The agent writes the appropriate script (Python, Node, or shell) and shows it to you before execution.

File Edit · Sandbox Execute
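A sketch of the kind of script the agent would show for review — here for the cleaning example from the Impact section, removing zero-amount rows while leaving the original file untouched (paths and columns are illustrative):

```python
import csv

# Demo input; in practice this is the user's existing file.
with open("sales.csv", "w", newline="") as f:
    f.write("id,amount\n1,120.50\n2,0\n3,87.00\n4,0\n")

# Filter out zero-amount rows, writing a NEW file so the
# original data is preserved.
with open("sales.csv", newline="") as src, \
     open("sales-cleaned.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    kept = 0
    for row in reader:
        if float(row["amount"]) != 0:
            writer.writerow(row)
            kept += 1

print(f"kept {kept} rows")  # → kept 2 rows
```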
3

Run, validate, and export

Approve the script and the agent runs it — either through sandboxed Python/Node execution or shell pipelines. It reads the output, reports row counts, shows a sample, and flags anomalies.

Sandbox Execute · Shell · File Read
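The validation pass might amount to something like this sketch: count the output rows, show a sample, and flag suspect rows. The file and the "empty cell" anomaly rule are invented for illustration:

```python
import csv

# Demo output file standing in for a script's result.
with open("out.csv", "w", newline="") as f:
    f.write("id,amount\n1,120.50\n2,\n3,87.00\n")

# Count rows, keep a sample, and flag rows with empty cells.
with open("out.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Line numbers start at 2 because line 1 is the header.
anomalies = [i for i, r in enumerate(rows, start=2)
             if any(v == "" for v in r.values())]

print(f"{len(rows)} rows written")          # → 3 rows written
print("sample:", rows[0])
print("rows with empty cells:", anomalies)  # → rows with empty cells: [3]
```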

Try it yourself

What you would type

Copy any of these into Lapu AI to get started immediately.

>Read the CSV at ~/reports/sales-q1.csv, remove all rows where the amount is zero, and save the cleaned result as a new file.

>Merge all JSON files in the data/ folder into a single CSV, using the 'id' field as the primary key.

>Parse the nginx access log with a Python script and generate a summary showing the top 20 URLs by request count.
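As a rough sketch of what the log-analysis prompt above could generate, assuming the common combined log format where the request path is the seventh whitespace-separated field (the sample lines are fabricated):

```python
from collections import Counter

# Tiny stand-in for an nginx access log in the combined format.
log_lines = [
    '1.2.3.4 - - [01/Jan/2025:00:00:00 +0000] "GET /index.html HTTP/1.1" 200 512',
    '1.2.3.4 - - [01/Jan/2025:00:00:01 +0000] "GET /api/users HTTP/1.1" 200 128',
    '5.6.7.8 - - [01/Jan/2025:00:00:02 +0000] "GET /index.html HTTP/1.1" 200 512',
]

# The request path is field 7 (index 6) in this format.
counts = Counter(line.split()[6] for line in log_lines)

# Print the top 20 URLs by request count.
for url, n in counts.most_common(20):
    print(f"{n:6d}  {url}")
```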

Ready to try this workflow?

Download Lapu AI and run it on your own machine. Free to start.

Download for free

FAQ

Common questions


What data formats does Lapu AI support?

Lapu AI can read any text-based format including CSV, TSV, JSON, JSONL, XML, YAML, and plain text logs. For binary formats like Excel (.xlsx), it uses the built-in XLSX skill or command-line tools.

Is my data sent to the cloud for processing?

No. The data processing itself happens locally — via shell commands or sandboxed Python/Node scripts running on your machine. Only structural context (like column names and sample rows) is sent to the AI model to generate the processing logic.

What is sandbox execution?

Lapu AI can run Python and Node scripts in a sandboxed environment on your machine. It automatically installs approved packages if needed, captures output (up to 60KB), and enforces timeouts. This is safer than running raw shell commands for complex data transformations.
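A heavily simplified sketch of that idea — not Lapu AI's actual sandbox implementation — using a separate process, a timeout, and a cap on captured output:

```python
import subprocess
import sys

MAX_OUTPUT = 60 * 1024  # 60KB output cap, matching the limit described above


def run_sandboxed(script: str, timeout: float = 30.0) -> str:
    """Run a Python snippet in a child process, capture its stdout,
    enforce a timeout, and truncate oversized output."""
    result = subprocess.run(
        [sys.executable, "-c", script],
        capture_output=True,
        text=True,
        timeout=timeout,  # raises subprocess.TimeoutExpired if exceeded
    )
    return result.stdout[:MAX_OUTPUT]


out = run_sandboxed("print(sum(range(10)))")
print(out)  # → 45
```

A real sandbox adds far more (package installation, filesystem and network policy); this only illustrates the timeout-and-capture shape.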

Put your busywork on autopilot

Lapu AI handles the repetitive work between you and outcomes. One desktop agent, zero tab-switching. Available now on macOS and Windows.

Create a free account. Download in under a minute.
