AetherPackWriter is the primary class for creating APACK archives. It supports adding entries from streams, files, and byte arrays with optional compression and encryption.
public final class AetherPackWriter implements Closeable

Creates a writer to an output stream with default settings.
public static AetherPackWriter create(OutputStream output)

Parameters:
output - The output stream to write to
Returns: A new writer instance
Note: Stream-based writers cannot update the file header after closing, limiting random access support.
Example:
try (AetherPackWriter writer = AetherPackWriter.create(outputStream)) {
writer.addEntry("file.txt", data);
}

Creates a writer with custom configuration.

public static AetherPackWriter create(OutputStream output, ApackConfiguration config)

Parameters:
output - The output stream
config - Configuration settings
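This overload has no example on this page; the following is a minimal sketch consistent with the other examples here (the choice of LZ4 compression is an assumption, mirroring the streaming example further down):

```java
// Sketch: stream-based writer with a custom configuration (assumed codec choice)
ApackConfiguration config = ApackConfiguration.builder()
    .compression(CompressionRegistry.lz4())
    .build();
try (AetherPackWriter writer = AetherPackWriter.create(outputStream, config)) {
    writer.addEntry("file.txt", data);
}
```

As with the no-configuration overload, the stream-based writer cannot update the header after closing.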
Creates a writer to a file with default settings.
public static AetherPackWriter create(Path path) throws IOException

Parameters:
path - File path to create
Returns: A new writer instance
Throws:
IOException - If file cannot be created
SecurityException - If access is denied
Note: File-based writers can update the header with entry count and trailer offset, enabling full random access.
Example:
try (AetherPackWriter writer = AetherPackWriter.create(Path.of("archive.apack"))) {
writer.addEntry("readme.txt", "Hello, World!".getBytes());
}

Creates a writer to a file with custom configuration.
public static AetherPackWriter create(Path path, ApackConfiguration config)
throws IOException

Parameters:
path - File path to create
config - Configuration settings
Returns: A new writer instance
Example:
ApackConfiguration config = ApackConfiguration.builder()
.compression(CompressionRegistry.zstd(), 6)
.chunkSize(128 * 1024)
.build();
try (AetherPackWriter writer = AetherPackWriter.create(path, config)) {
writer.addEntry("large-file.dat", inputStream);
}

Adds an entry from an input stream.

public void addEntry(String name, InputStream input) throws IOException

Parameters:
name - Entry name (path within archive)
input - Data source (read until EOF)
Throws: IOException - If an I/O error occurs or the writer has already been closed
Note: The input stream is NOT closed by this method.
Example:
try (InputStream fileInput = Files.newInputStream(sourcePath)) {
writer.addEntry("data/file.bin", fileInput);
}

Adds an entry with full metadata control.

public void addEntry(EntryMetadata metadata, InputStream input) throws IOException

Parameters:
metadata - Entry metadata including name, MIME type, attributes
input - Data source
Example:
EntryMetadata metadata = EntryMetadata.builder()
.name("document.pdf")
.mimeType("application/pdf")
.attribute("author", "John Doe")
.attribute("created", System.currentTimeMillis())
.build();
try (InputStream pdfStream = Files.newInputStream(pdfPath)) {
writer.addEntry(metadata, pdfStream);
}

Adds an entry from a file.

public void addEntry(String name, Path path) throws IOException

Parameters:
name - Entry name (can differ from file name)
path - Source file path
Throws:
IOException - If file cannot be read
NoSuchFileException - If file doesn't exist
Example:
// Store with different name
writer.addEntry("config/settings.json", Path.of("/etc/myapp/config.json"));
// Store with same name
Path file = Path.of("data.bin");
writer.addEntry(file.getFileName().toString(), file);

Adds an entry from a byte array.

public void addEntry(String name, byte[] data) throws IOException

Parameters:
name - Entry name
data - Entry content
Example:
String json = "{\"version\": 1}";
writer.addEntry("metadata.json", json.getBytes(StandardCharsets.UTF_8));

Returns the number of entries written so far.

public int getEntryCount()

Returns: Entry count (≥ 0)
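The count reflects completed addEntry() calls, which makes it convenient for progress logging. A short sketch, reusing a writer as set up in the examples above:

```java
writer.addEntry("a.txt", "alpha".getBytes(StandardCharsets.UTF_8));
writer.addEntry("b.txt", "beta".getBytes(StandardCharsets.UTF_8));
// Per the contract above, the count equals the number of entries written so far
System.out.println("Entries: " + writer.getEntryCount());
```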
Closes the writer and finalizes the archive.
public void close() throws IOException

Operations performed:
- Writes file header (if not already written)
- Writes trailer with table of contents
- Flushes buffered data
- Updates file header (if writing to file)
- Closes underlying stream
Important:
- Failure to close results in an incomplete archive
- Method is idempotent (safe to call multiple times)
- Empty archives are valid (header + trailer only)
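Because close() is idempotent, a defensive second call (for example, a manual close in a finally block followed by another close elsewhere) is harmless. A sketch illustrating this, using the stream-based factory from above:

```java
AetherPackWriter writer = AetherPackWriter.create(outputStream);
try {
    writer.addEntry("file.txt", data);
} finally {
    writer.close(); // finalizes header, trailer, and table of contents
}
writer.close(); // safe: close() is idempotent
```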
ApackConfiguration config = ApackConfiguration.builder()
.compression(CompressionRegistry.zstd(), 6) // ZSTD level 6
.build();
try (AetherPackWriter writer = AetherPackWriter.create(path, config)) {
writer.addEntry("data.bin", largeInputStream);
}

// Generate encryption key
EncryptionProvider aes = EncryptionRegistry.aes256Gcm();
SecretKey key = aes.generateKey();
ApackConfiguration config = ApackConfiguration.builder()
.encryption(aes, key)
.build();
try (AetherPackWriter writer = AetherPackWriter.create(path, config)) {
writer.addEntry("secrets.dat", sensitiveData);
}

ApackConfiguration config = ApackConfiguration.builder()
.compression(CompressionRegistry.zstd(), 3)
.encryption(EncryptionRegistry.aes256Gcm(), secretKey)
.chunkSize(64 * 1024) // 64 KB chunks
.build();
try (AetherPackWriter writer = AetherPackWriter.create(path, config)) {
// Data is compressed, then encrypted
writer.addEntry("data.bin", inputStream);
}

// Derive key from password using Argon2id
char[] password = getPassword();
byte[] salt = generateSalt(32);
byte[] keyBytes = Argon2id.derive(
password, salt,
3, // time cost
65536, // memory (64 MB)
4, // parallelism
32 // key length
);
SecretKey dek = new SecretKeySpec(keyBytes, "AES");
// Create encryption block with KDF parameters
EncryptionBlock encBlock = EncryptionBlock.builder()
.kdfAlgorithmId(FormatConstants.KDF_ARGON2ID)
.cipherAlgorithmId(FormatConstants.ENCRYPTION_AES_256_GCM)
.kdfIterations(3)
.kdfMemory(65536)
.kdfParallelism(4)
.salt(salt)
.wrappedKey(wrapKey(dek)) // wrapKey: caller-supplied helper that encrypts the DEK with a KEK (not shown)
.wrappedKeyTag(tag) // tag: authentication tag produced by that key-wrapping step
.build();
ApackConfiguration config = ApackConfiguration.builder()
.encryption(EncryptionRegistry.aes256Gcm(), dek, encBlock)
.build();
try (AetherPackWriter writer = AetherPackWriter.create(path, config)) {
writer.addEntry("data.bin", inputStream);
}

ApackConfiguration config = ApackConfiguration.builder()
.chunkSize(1024 * 1024) // 1 MB chunks
.compression(CompressionRegistry.zstd(), 9) // Higher compression
.build();

ApackConfiguration config = ApackConfiguration.builder()
.streamMode(true)
.compression(CompressionRegistry.lz4())
.build();
try (AetherPackWriter writer = AetherPackWriter.create(pipeOutputStream, config)) {
writer.addEntry("stream.dat", inputStream);
}

public void createArchive(Path archivePath, Path sourceDir) throws IOException {
// Configure compression
ApackConfiguration config = ApackConfiguration.builder()
.compression(CompressionRegistry.zstd(), 6)
.chunkSize(256 * 1024)
.build();
try (AetherPackWriter writer = AetherPackWriter.create(archivePath, config)) {
// Walk directory and add files
Files.walk(sourceDir)
.filter(Files::isRegularFile)
.forEach(file -> {
try {
// Compute relative path for archive entry name
String entryName = sourceDir.relativize(file)
.toString()
.replace('\\', '/'); // Normalize separators
// Detect MIME type
String mimeType = Files.probeContentType(file);
// Build metadata
EntryMetadata metadata = EntryMetadata.builder()
.name(entryName)
.mimeType(mimeType != null ? mimeType : "application/octet-stream")
.attribute("lastModified", Files.getLastModifiedTime(file).toMillis())
.build();
// Add entry
try (InputStream input = Files.newInputStream(file)) {
writer.addEntry(metadata, input);
}
System.out.println("Added: " + entryName);
} catch (IOException e) {
throw new UncheckedIOException(e);
}
});
System.out.println("Total entries: " + writer.getEntryCount());
}
}

- Always use try-with-resources to ensure proper closing
- Choose appropriate chunk size:
  - Smaller (16-64 KB) for random access
  - Larger (256 KB - 1 MB) for better compression
- Match compression level to use case:
  - Low levels (1-3) for speed
  - High levels (6-9) for ratio
- Use file paths when possible for full random-access support
- Close input streams after calling addEntry() (they are not closed automatically)