We introduce low-regularity exponential-type integrators for nonlinear Schrödinger equations for which first-order convergence only requires the boundedness of one additional derivative of the solution. More precisely, we prove first-order convergence in $H^r$ of the derived schemes for solutions in $H^{r+1}$ ($r > d/2$). This allows for lower regularity assumptions on the data than are required, for instance, by classical splitting or exponential integration schemes. For one-dimensional quadratic Schrödinger equations we can even prove first-order convergence without any loss of regularity. Numerical experiments underline the favorable error behavior of the newly introduced exponential-type integrators for low-regularity solutions compared to classical splitting and exponential integration schemes.
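The central convergence claim can be written schematically as follows (a sketch only: the precise constant, norm dependence, and assumptions are as stated in the abstract, with $u(t_n)$ the exact solution, $u^n$ the numerical approximation, and $\tau$ the time step; the exact form of the constant is an assumption here):

```latex
\| u(t_n) - u^n \|_{H^r} \le C\,\tau \sup_{0 \le t \le t_n} \| u(t) \|_{H^{r+1}},
\qquad r > \tfrac{d}{2},
```

so that first-order convergence in $H^r$ requires only one additional derivative of the solution, i.e. boundedness in $H^{r+1}$.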