# @smooai/file

Stream-first file handling library. A unified API for reading files from local paths, URLs, S3, and FormData, with automatic file type detection and streaming by default.
## Overview

@smooai/file provides a unified interface for working with files from any source. It loads file bytes lazily wherever possible to minimize memory usage, so you can process large files without reading them entirely into memory.

- **Stream-First**: Lazy loading of file contents with automatic stream handling. Supports both Node.js and Web streams.
- **Multiple Sources**: Read from the local filesystem, URLs, S3 buckets, and FormData uploads with a single consistent API.
- **Type Detection**: Automatic MIME type detection via magic numbers, supporting over 100 file types out of the box.
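The stream-first behavior described above can be sketched with Node's built-in stream utilities; this sketch uses no @smooai/file APIs and is only meant to show how lazy, pull-based chunking and Node/Web stream interop work:

```typescript
import { Readable } from 'node:stream';

// A lazy source: chunks are produced only as the consumer pulls them,
// so nothing is buffered up front.
async function* generate() {
    for (let i = 0; i < 3; i++) yield `chunk-${i}`;
}

async function main(): Promise<string[]> {
    const received: string[] = [];
    // Consuming a Node Readable chunk by chunk, never the whole source at once.
    for await (const chunk of Readable.from(generate())) {
        received.push(chunk as string);
    }

    // Node streams interoperate with Web streams via Readable.toWeb/fromWeb.
    const webStream = Readable.toWeb(Readable.from(['a', 'b']));
    console.log(received.length, webStream instanceof ReadableStream);

    return received;
}

main();
```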
## Features

- **Rich Metadata**: Access file name, extension, MIME type, size, last modified date, creation date, hash, URL, and source type.
- **Pipe-Friendly Streaming**: Stream files between sources without buffering. Pipe from URL to S3, local to S3, or any combination.
- **S3 Integration**: Direct upload, download, save, and move operations with S3. Generate signed URLs for temporary access.
- **TypeScript-First**: Full type safety with typed metadata, source-specific methods, and IntelliSense-friendly APIs.
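As a rough sketch of what the typed metadata above might look like, the interface below is inferred from the fields listed on this page; it is not the library's actual typing:

```typescript
// Hypothetical shape, assembled from the metadata fields described above.
interface FileMetadata {
    name: string;
    extension: string;
    mimeType: string;
    size: number;
    lastModified?: Date;
    createdAt?: Date;
    hash?: string;
    url?: string;
    sourceType: 'file' | 'url' | 's3' | 'formdata';
}

const example: FileMetadata = {
    name: 'report.pdf',
    extension: 'pdf',
    mimeType: 'application/pdf',
    size: 1_234_567,
    sourceType: 'url',
};

console.log(example.mimeType); // 'application/pdf'
```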
## Installation

```bash
pnpm add @smooai/file
```

## Quick Start
Create a file instance from any source and work with it using a unified API. File contents are streamed lazily by default.
```typescript
import File from '@smooai/file';

// From URL
const file = await File.createFromUrl('https://example.com/document.pdf');
console.log(file.mimeType); // 'application/pdf'
console.log(file.size); // 1234567

// Stream to S3
await file.pipeTo(s3Destination);
```

## File Sources
@smooai/file supports four different sources for creating file instances. Each source provides the same unified API for reading, streaming, and metadata access.
### From URL
Download files from any URL with automatic stream-based transfer and header metadata extraction.
```typescript
import File from '@smooai/file';

const file = await File.createFromUrl('https://example.com/large-file.zip');

// Pipe to a destination (streams without loading the entire file)
await file.pipeTo(someWritableStream);

// Read as bytes (streams in chunks)
const bytes = await file.readFileBytes();

// Save to the filesystem (streams directly)
const { original, newFile } = await file.saveToFile('downloads/file.zip');
```

### From Local Path
Read files from the local filesystem with full metadata extraction and streaming support.
```typescript
const local = await File.createFromFile('path/to/file.csv');

// Read file contents
const content = await local.readFileString();

// Get file metadata
console.log(local.metadata);
// { name: 'file.csv', mimeType: 'text/csv', size: 1234, extension: 'csv', ... }
```

### From S3
Direct S3 integration with stream-based transfer, header metadata extraction, and signed URL generation.
```typescript
// Create from S3 (streams automatically)
const s3File = await File.createFromS3('my-bucket', 'path/to/file.jpg');

// Upload to S3 (streams directly)
await s3File.uploadToS3('my-bucket', 'remote/file.jpg');

// Save to S3 (creates a new file instance)
const { original, newFile } = await s3File.saveToS3('my-bucket', 'remote/file.jpg');

// Generate a signed URL for temporary access
const signedUrl = await s3File.getSignedUrl(3600); // Expires in 1 hour
```

### From FormData
Handle multipart file uploads with ease: create file instances from FormData for processing, or convert a file instance to FormData for outgoing requests.
```typescript
const file = await File.createFromFile('document.pdf');

// Convert to FormData for uploads
const formData = await file.toFormData('document');

// Use with fetch or other HTTP clients
await fetch('https://api.example.com/upload', {
    method: 'POST',
    body: formData,
});
```

## File Type Detection
File types are automatically detected using magic number analysis, providing accurate MIME type identification regardless of file extension. Supports over 100 file types out of the box.
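To illustrate what magic-number detection means, here is a minimal, self-contained sketch. The library itself relies on the file-type package for this; the two signatures below are purely illustrative:

```typescript
// A magic number is a fixed byte sequence at the start of a file that
// identifies its format independently of the file extension.
const SIGNATURES: Array<{ bytes: number[]; mime: string }> = [
    { bytes: [0x25, 0x50, 0x44, 0x46], mime: 'application/pdf' }, // "%PDF"
    { bytes: [0x89, 0x50, 0x4e, 0x47], mime: 'image/png' }, // "\x89PNG"
];

function sniffMime(buf: Uint8Array): string | undefined {
    // Compare the leading bytes against each known signature.
    return SIGNATURES.find((s) => s.bytes.every((b, i) => buf[i] === b))?.mime;
}

console.log(sniffMime(new Uint8Array([0x25, 0x50, 0x44, 0x46, 0x2d]))); // 'application/pdf'
```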
```typescript
import File from '@smooai/file';

const file = await File.createFromFile('document.xml');

// Get file type information (detected via magic numbers)
console.log(file.mimeType); // 'application/xml'
console.log(file.extension); // 'xml'

// Detection sources (in priority order):
// 1. Magic numbers (via the file-type library)
// 2. MIME type headers (for URL and S3 sources)
// 3. File extension
// 4. Custom detectors
```

## Streaming
@smooai/file is built around streaming. Files are never fully loaded into memory unless you explicitly request it. This makes it safe to work with files of any size.
```typescript
import File from '@smooai/file';

// Download from a URL and upload to S3 without buffering
const file = await File.createFromUrl('https://example.com/large-dataset.csv');
await file.uploadToS3('data-bucket', 'imports/dataset.csv');

// Move files between S3 buckets (streams directly)
const s3File = await File.createFromS3('source-bucket', 'file.pdf');
const moved = await s3File.moveToS3('dest-bucket', 'archived/file.pdf');
```

## Reading Methods
| Method | Description |
|---|---|
| `readFileString()` | Read the entire file as a UTF-8 string |
| `readFileBytes()` | Read the entire file as a byte array |
| `pipeTo(writable)` | Stream file contents to a writable destination |
| `saveToFile(path)` | Stream the file to the local filesystem |
| `uploadToS3(bucket, key)` | Stream the file directly to S3 |
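The buffered-versus-streamed split in the table above can be illustrated with plain Node APIs; the temp-file paths below are placeholders for this sketch and have nothing to do with @smooai/file itself:

```typescript
import { createReadStream, createWriteStream, readFileSync, writeFileSync } from 'node:fs';
import { pipeline } from 'node:stream/promises';
import { tmpdir } from 'node:os';
import { join } from 'node:path';

async function demo(): Promise<boolean> {
    const src = join(tmpdir(), 'smoo-demo-src.txt');
    const dst = join(tmpdir(), 'smoo-demo-dst.txt');
    writeFileSync(src, 'hello world');

    // Buffered read: the whole file lands in memory at once,
    // analogous to readFileString() / readFileBytes().
    const text = readFileSync(src, 'utf8');

    // Streamed copy: bytes flow chunk by chunk without full buffering,
    // analogous to pipeTo() / saveToFile() / uploadToS3().
    await pipeline(createReadStream(src), createWriteStream(dst));

    return text === readFileSync(dst, 'utf8');
}

demo().then((same) => console.log(same)); // true
```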
## Real-World Example: File Upload Pipeline
A complete example showing how to accept a file upload, validate its type, and store it in S3 with metadata.
```typescript
import File from '@smooai/file';

async function handleUpload(formData: FormData) {
    // Create a file from the upload (here the 'document' field carries a path string)
    const file = await File.createFromFile(formData.get('document') as string);

    // Validate the file type
    const allowedTypes = ['application/pdf', 'image/png', 'image/jpeg'];
    if (!allowedTypes.includes(file.mimeType)) {
        throw new Error(`Unsupported file type: ${file.mimeType}`);
    }

    // Upload to S3 under an organized key
    const key = `uploads/${Date.now()}/${file.metadata.name}`;
    await file.uploadToS3('documents-bucket', key);

    // Generate a signed URL for immediate access
    const signedUrl = await file.getSignedUrl(3600);

    return { key, signedUrl, mimeType: file.mimeType, size: file.metadata.size };
}
```