diff --git a/.gitignore b/.gitignore
index 6be37a8..54eb4a8 100644
--- a/.gitignore
+++ b/.gitignore
@@ -1,9 +1,11 @@
 .venv
 __pycache__
+**/*.parquet
 checkpoints/
 logs/
 .DS_Store
 .env
 .pytest_cache/
 .ruff_cache/
-.cursor/
\ No newline at end of file
+.cursor/
+**/*egg-info*
diff --git a/README.md b/README.md
index e8d3a6f..486f7ce 100644
--- a/README.md
+++ b/README.md
@@ -1,23 +1,95 @@
-# HOLO-PASWIN: In-Line Holographical Physics-Aware SWIN Transformer
+
-This repository includes pre-commit hooks that automatically run code quality checks (ruff and mypy) before each commit. To install them:
+*HoloPASWIN recovers clean phase and amplitude mappings from a single intensity hologram, directly eliminating twin-image artifacts.*
+
+
+HoloPASWIN utilizes a U-shaped architecture based on Swin Transformer blocks. The model first processes an input intensity hologram using the backward Angular Spectrum Method (ASM) to obtain an initial, artifact-heavy complex field. A 4-stage Swin Encoder-Decoder network then extracts multi-scale features to predict a residual correction. By adding this correction to the initial field and training with both frequency-domain constraints and a physics-based forward propagation loss, the network robustly recovers the clean phase and amplitude.
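The pipeline above can be sketched numerically. Below is a minimal, illustrative numpy implementation of the ASM back-propagation step, not the code in `src/`; the wavelength, pixel pitch, and propagation distance are assumed values chosen only for illustration:

```python
import numpy as np

def asm_propagate(field, z, wavelength, dx):
    """Angular Spectrum Method: propagate a complex field over distance z.
    A negative z performs the backward propagation that produces the
    initial (twin-image-contaminated) field from a hologram."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = (2 * np.pi / wavelength) * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * z * kz) * (arg > 0)   # evanescent components are dropped
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Backward-propagate a stand-in hologram by 20 mm (assumed parameters):
hologram = np.random.default_rng(0).random((256, 256))
u0 = asm_propagate(np.sqrt(hologram).astype(np.complex128),
                   z=-20e-3, wavelength=532e-9, dx=2e-6)
print(u0.shape, u0.dtype)  # (256, 256) complex128
```

The network's job is then to predict a residual correction that is added to `u0` before the final amplitude/phase readout.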
+
+
+## Installation
+
+This project uses [uv](https://github.com/astral-sh/uv) for fast and reliable dependency management.
+
+1. **Install uv** (if not already installed):
+ ```bash
+ curl -LsSf https://astral.sh/uv/install.sh | sh
+ ```
+
+2. **Sync Dependencies**:
+ Navigate to the `holopaswin` directory and run:
+ ```bash
+ uv sync
+ ```
+ This creates a virtual environment and installs all locked dependencies from `uv.lock`.
+
+## Usage
+
+### Training
+
+To train the model from scratch on the dataset, run:
+
+```bash
+uv run src/train.py
+```
+
+### Inference & Testing
+
+To evaluate a trained model and generate visualization results on test samples, run:
+
+```bash
+uv run src/inference.py
+```
+
+To calculate full quantitative metrics (SSIM, PSNR, MSE) over the test dataset, run:
+
+```bash
+uv run src/test.py
+```
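The reported metrics can be cross-checked by hand. The following is a minimal, illustrative sketch of MSE and PSNR, not the project's implementation; SSIM is omitted here because it requires a windowed implementation such as `skimage.metrics.structural_similarity`:

```python
import numpy as np

def mse(a: np.ndarray, b: np.ndarray) -> float:
    """Mean squared error between two images."""
    return float(np.mean((a - b) ** 2))

def psnr(a: np.ndarray, b: np.ndarray, data_range: float = 1.0) -> float:
    """Peak signal-to-noise ratio in dB for images in [0, data_range]."""
    m = mse(a, b)
    return float("inf") if m == 0 else 10.0 * np.log10(data_range**2 / m)

gt = np.zeros((8, 8))
pred = np.full((8, 8), 0.1)   # a uniform error of 0.1 everywhere
print(f"MSE={mse(gt, pred):.4f}  PSNR={psnr(gt, pred):.1f} dB")  # MSE=0.0100  PSNR=20.0 dB
```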
+
+### Development
+
+This repository includes pre-commit hooks that automatically run code quality checks (`ruff` and `mypy`) before each commit. To install them:
```bash
./scripts/install-hooks.sh
```
-This will set up the hooks from `.githooks/` to `.git/hooks/`. The hooks will:
-- Run `ruff check` on `src/`
-- Run `ruff format --check` on `src/`
-- Run `mypy --strict` on `src/`
+If any check fails, the commit will be blocked.
+
+## Citation
-If any check fails, the commit will be blocked. You can bypass the hook with `git commit --no-verify` (not recommended).
+If you find this code, dataset, or model useful for your research, please cite our paper:
+```bibtex
+@misc{kocmarli2026holopaswinrobustinlineholographic,
+ title={HoloPASWIN: Robust Inline Holographic Reconstruction via Physics-Aware Swin Transformers},
+ author={Gökhan Koçmarlı and G. Bora Esmer},
+ year={2026},
+ eprint={2603.04926},
+ archivePrefix={arXiv},
+ primaryClass={eess.IV},
+ url={https://arxiv.org/abs/2603.04926},
+}
+```
diff --git a/docs/images/architecture.png b/docs/images/architecture.png
new file mode 100644
index 0000000..1569b4f
Binary files /dev/null and b/docs/images/architecture.png differ
diff --git a/docs/images/comparison.png b/docs/images/comparison.png
new file mode 100644
index 0000000..90f055e
Binary files /dev/null and b/docs/images/comparison.png differ
diff --git a/docs/images/detailed_comparison.png b/docs/images/detailed_comparison.png
new file mode 100644
index 0000000..ab500d8
Binary files /dev/null and b/docs/images/detailed_comparison.png differ
diff --git a/docs/images/z_mismatch_plot.png b/docs/images/z_mismatch_plot.png
new file mode 100644
index 0000000..5ede695
Binary files /dev/null and b/docs/images/z_mismatch_plot.png differ
diff --git a/docs/index.html b/docs/index.html
new file mode 100644
index 0000000..91788c4
--- /dev/null
+++ b/docs/index.html
@@ -0,0 +1,310 @@
+ In-line digital holography (DIH) is a widely used lensless imaging technique, valued for its simplicity and capability to image samples at high throughput. However, capturing only the intensity of the interference pattern during the recording process gives rise to unwanted terms such as the cross-term and the twin-image. The cross-term can be suppressed by adjusting the intensity of the reference wave, but the twin-image problem remains. The twin-image is a spectral artifact that superimposes a defocused conjugate wave onto the reconstructed object, severely degrading image quality.
+
+ While deep learning has recently emerged as a powerful tool for phase retrieval, traditional Convolutional Neural Networks (CNNs) are limited by their local receptive fields, making them less effective at capturing the global diffraction patterns inherent in holography. In this study, we introduce HoloPASWIN, a physics-aware deep learning framework based on the Swin Transformer architecture. By leveraging hierarchical shifted-window attention, our model efficiently captures both local details and the long-range dependencies essential for accurate holographic reconstruction.
+
+ We propose a comprehensive loss function that integrates frequency-domain constraints with physical consistency via a differentiable angular spectrum propagator, ensuring high spectral fidelity. Validated on a large-scale synthetic dataset of 25,000 samples with diverse noise configurations, HoloPASWIN demonstrates effective twin-image suppression and robust reconstruction quality.
+
+ HoloPASWIN implements a U-shaped encoder-decoder architecture but replaces standard convolutional layers with Swin Transformer blocks to capture long-range diffraction dependencies.
+ We constrain the network across the spatial, frequency, and physical measurement domains. The total loss combines a supervised L1 loss on amplitude, phase, and the complex field with a frequency loss ($\mathcal{L}_{freq}$) that prevents smoothing of high-frequency details.
+
+ Additionally, an unsupervised physics loss ($\mathcal{L}_{phy}$) enforces forward imaging consistency: it propagates the predicted object field forward through a differentiable ASM layer and minimizes its distance to the original intensity hologram, penalizing any physically inconsistent twin-image remnants.
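+
+ In symbols, with $I$ the recorded intensity hologram, $\hat{U}_o$ the predicted object field, and $\mathcal{P}_z$ the differentiable ASM propagator over the recording distance $z$, this consistency term takes a form such as
+
+ $$\mathcal{L}_{phy} = \left\| \, \left| \mathcal{P}_z(\hat{U}_o) \right|^2 - I \, \right\|_1$$
+
+ (the choice of the $L_1$ norm here is an illustrative assumption, not taken from the paper).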
+ Quantitative evaluations underscore the strength of our framework on a demanding synthetic dataset. HoloPASWIN achieves a Phase SSIM of 0.974 and a Phase PSNR of 44.3 dB, demonstrating the robust phase recovery crucial for QPI applications.
+
+ Qualitative comparison of reconstruction results. Rows correspond to different test samples. From left to right: input hologram, GT amplitude, predicted amplitude, GT phase, predicted phase. The model effectively removes twin images and background noise while preserving object sharpness.
+
+ Detailed error analysis. Close-up regions emphasize the model's preservation of fine structural detail and the correspondingly low magnitudes in the error maps. Differences are mostly confined to high-frequency edges.
+
+ Distance robustness. The model is most accurate when tested at the training calibration distance (z = 20 mm), highlighting the highly geometry-specific nature of deep phase retrieval mappings. Performance degrades for distance mismatches $\ge 0.5$ mm.
+
+ @misc{kocmarli2026holopaswinrobustinlineholographic,
+ title={HoloPASWIN: Robust Inline Holographic Reconstruction via Physics-Aware Swin Transformers},
+ author={Gökhan Koçmarlı and G. Bora Esmer},
+ year={2026},
+ eprint={2603.04926},
+ archivePrefix={arXiv},
+ primaryClass={eess.IV},
+ url={https://arxiv.org/abs/2603.04926},
+}
+