We study the minimization of a non-convex sparsity-promoting penalty function, the transformed \(l_1\) (TL1), and its application in compressed sensing (CS). The TL1 penalty interpolates between the \(l_0\) and \(l_1\) norms through a positive parameter \(a \in (0,+\infty)\), much as \(l_p\) does for \(p \in (0,1]\). TL1 is known in the statistics literature to enjoy three desirable properties: unbiasedness, sparsity, and Lipschitz continuity. We first consider the constrained minimization problem and prove uniqueness of the global minimizer and its equivalence to \(l_0\) norm minimization when the sensing matrix \(A\) satisfies a restricted isometry property (RIP) and \(a > a^*\), where \(a^*\) depends only on \(A\). The solution is stable under noisy measurements. For a general sensing matrix \(A\), we show that the support set of a local minimizer corresponds to linearly independent columns of \(A\), and we recall sufficient conditions for a critical point to be a local minimum. Next, we present difference-of-convex algorithms for TL1 (DCATL1) to compute TL1-regularized constrained and unconstrained problems in CS. For the unconstrained problem, we prove convergence of DCATL1 to a stationary point satisfying the first-order optimality condition. Finally, in numerical experiments, we identify \(a = 1\) as the optimal parameter value and compare DCATL1 with other CS algorithms on three classes of sensing matrices: Gaussian random matrices, over-sampled discrete cosine transform (ODCT) matrices, and uniformly distributed M-sphere matrices. For all three classes of sensing matrices, the performance of the DCATL1 algorithm (initialized with \(l_1\) minimization) consistently ranks at or near the top, and it is the most robust choice, insensitive to the RIP (incoherence) of the underlying CS problems.
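For reference, a sketch of the TL1 penalty in its common parametrization from the statistics literature (notation here is illustrative, not fixed by the abstract):
\[
P_a(x) = \sum_{i=1}^{n} \rho_a(x_i), \qquad \rho_a(t) = \frac{(a+1)\,|t|}{a + |t|}, \quad a \in (0,+\infty),
\]
which makes the interpolation explicit: as \(a \to 0^+\), \(\rho_a(t) \to \mathbf{1}_{\{t \neq 0\}}\) pointwise, so \(P_a\) recovers the \(l_0\) norm, while as \(a \to +\infty\), \(\rho_a(t) \to |t|\), recovering the \(l_1\) norm.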