Ptychography is a lensless imaging technique that has become popular among practitioners over the last two decades. It relies on a series of illuminations of the object of interest: at each step a small region of the specimen is illuminated, and the resulting diffraction pattern is captured by a detector in the far field. Because the illuminated regions overlap, the measurements carry surplus information, which allows the object to be recovered from the collection of observed diffraction patterns.
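As a minimal sketch of the measurement model just described, the following simulates far-field diffraction patterns for overlapping probe positions; the function and variable names (ptycho_patterns, probe, shifts) are illustrative choices, not notation from the talk.

```python
import numpy as np

def ptycho_patterns(obj, probe, shifts):
    """Simulate far-field ptychographic diffraction patterns.

    At each scan position the probe illuminates a small patch of the
    object; the detector records the squared magnitude of the far-field
    (Fourier) diffraction of the exit wave.
    """
    n = probe.shape[0]
    patterns = []
    for (r, c) in shifts:
        exit_wave = probe * obj[r:r + n, c:c + n]  # illuminated patch
        patterns.append(np.abs(np.fft.fft2(exit_wave)) ** 2)
    return np.array(patterns)

# Toy example: 16x16 complex object, 8x8 probe, overlapping 4-pixel steps
rng = np.random.default_rng(0)
obj = rng.standard_normal((16, 16)) + 1j * rng.standard_normal((16, 16))
probe = np.ones((8, 8), dtype=complex)
shifts = [(r, c) for r in range(0, 9, 4) for c in range(0, 9, 4)]
patterns = ptycho_patterns(obj, probe, shifts)
```

The 4-pixel step with an 8-pixel probe gives 50% overlap between neighboring patches, which is the redundancy the reconstruction exploits.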
The rising popularity of ptychography has sparked rapid development of reconstruction algorithms. Furthermore, many well-known and established methods for phase retrieval have been adapted to ptychography, since it can be viewed as a special case of the phase retrieval problem. With the large number of algorithms available, it is inevitable that some techniques share similarities, and the analysis of one approach can be used to study another.
In this talk we consider three of the methods present in the literature. The first is the gradient descent technique for the amplitude-based squared loss, known as Amplitude Flow. The second is the Error Reduction algorithm, an alternating projections approach. The last is the Ptychographic Iterative Engine (PIE), a computationally fast method that uses a single diffraction pattern at a time. We show that the latter two algorithms can also be viewed as gradient methods for the same amplitude-based squared loss function. More precisely, we show that Error Reduction performs scaled gradient descent, while PIE is nothing but stochastic gradient descent. Based on the convergence theory for Amplitude Flow, we further establish guaranteed convergence of both algorithms and show that the convergence speed is sublinear. We also discuss the implications, in terms of critical points, of these algorithms being gradient methods for the amplitude-based squared loss.
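To make the common loss concrete, the sketch below runs plain gradient descent on the amplitude-based squared loss f(x) = ½‖|Ax| − b‖² for a generic phase-retrieval instance, i.e. the Amplitude Flow template the talk refers to. The measurement matrix, step-size rule, and initialization here are illustrative assumptions, not the talk's actual setup.

```python
import numpy as np

def amp_loss(A, b, x):
    """Amplitude-based squared loss f(x) = 0.5 * || |Ax| - b ||^2."""
    return 0.5 * np.sum((np.abs(A @ x) - b) ** 2)

def amplitude_flow(A, b, x0, step, iters):
    """Plain gradient descent on the amplitude-based squared loss."""
    x = x0.astype(complex)
    for _ in range(iters):
        y = A @ x
        phase = y / np.maximum(np.abs(y), 1e-12)   # sign(y), zero-safe
        grad = A.conj().T @ ((np.abs(y) - b) * phase)
        x = x - step * grad
    return x

# Toy instance: complex Gaussian measurements of a random signal.
rng = np.random.default_rng(1)
m, n = 80, 10
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
x_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)
b = np.abs(A @ x_true)               # amplitude-only measurements

# Start near the truth and take conservative 1/||A||^2 steps.
x0 = x_true + 0.3 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
step = 1.0 / np.linalg.norm(A, 2) ** 2
x_hat = amplitude_flow(A, b, x0, step, iters=200)
```

Replacing the full sum over measurements with a single randomly chosen diffraction pattern per step turns this update into the stochastic gradient descent that, as shown in the talk, PIE performs.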
Lastly, we compare the performance of the three methods numerically: the robustness of the reconstruction under noise and the computation times are reported for synthetic data, and the resulting reconstructed objects are presented.