M. T. Turnbull, P. G. Petrov, C. S. Embrey, A. M. Marino, V. Boyer
Non-degenerate forward four-wave mixing in hot atomic vapors has been shown to produce strong quantum correlations between twin beams of light [McCormick et al., Opt. Lett. 32, 178 (2007)], in a configuration which minimizes losses by absorption. In this paper, we examine the role of the phase-matching condition in the trade-off between the efficiency of the nonlinear process and the absorption of the twin beams. To this end, we develop a semi-classical model by deriving the atomic susceptibilities in the relevant double-lambda configuration and by solving the classical propagation of the twin-beam fields for parameters close to those found in typical experiments. These theoretical results are confirmed by a simple experimental study of the nonlinear gain experienced by the twin beams as a function of the phase mismatch. The model shows that the amount of phase mismatch is key to realizing the physical conditions in which the absorption of the twin beams is minimized while the cross-coupling between the twin beams is maintained at the level required for the generation of strong quantum correlations. The optimum is reached when the four-wave mixing process is not fully phase matched.
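To get a feel for the trade-off the abstract describes, here is a minimal sketch of generic phase-mismatched coupled-mode propagation with asymmetric absorption on the probe and conjugate. It is not the paper's susceptibility-based model: the coupling constant kappa, the absorption coefficients, the cell length, and the function names are all illustrative assumptions, chosen only to show how a gain-versus-mismatch scan could be set up.

```python
# Minimal coupled-mode sketch (illustrative, not the paper's full model):
# a seeded probe and a conjugate field are coupled by a four-wave-mixing
# strength kappa, each attenuated by its own absorption coefficient, with
# a phase mismatch delta_k. All parameter values are assumptions.
import numpy as np
from scipy.integrate import solve_ivp

kappa   = 2.0    # 4WM cross-coupling strength [1/cm] (illustrative)
alpha_p = 2.0    # probe absorption [1/cm] (illustrative)
alpha_c = 0.2    # conjugate absorption [1/cm] (illustrative)
L       = 1.25   # interaction length [cm] (illustrative)

def coupled_modes(z, y, delta_k):
    """dA_p/dz and dA_c*/dz for phase-mismatched parametric coupling with loss."""
    Ap  = y[0] + 1j * y[1]          # probe amplitude
    Acs = y[2] + 1j * y[3]          # conjugate amplitude (conjugated)
    dAp  = -0.5 * alpha_p * Ap  + 1j * kappa * Acs * np.exp(+1j * delta_k * z)
    dAcs = -0.5 * alpha_c * Acs - 1j * kappa * Ap  * np.exp(-1j * delta_k * z)
    return [dAp.real, dAp.imag, dAcs.real, dAcs.imag]

def probe_gain(delta_k):
    """Output probe intensity for a unit probe seed, at a given phase mismatch."""
    y0 = [1.0, 0.0, 0.0, 0.0]       # seeded probe, empty conjugate
    sol = solve_ivp(coupled_modes, (0.0, L), y0, args=(delta_k,),
                    rtol=1e-8, atol=1e-10)
    Ap_out = sol.y[0, -1] + 1j * sol.y[1, -1]
    return abs(Ap_out) ** 2

for dk in np.linspace(-10.0, 10.0, 5):   # scan the phase mismatch [1/cm]
    print(f"delta_k = {dk:+6.2f} 1/cm -> probe gain = {probe_gain(dk):.3f}")
```

In this toy version the gain is largest near perfect phase matching and falls off as |delta_k| grows; the paper's point, which requires the full atomic-susceptibility treatment, is that once the probe's absorption depends on the detunings as well, the best compromise between gain and loss sits at a non-zero mismatch.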
View original:
http://arxiv.org/abs/1303.7187