Certification translates to an advantage in the workplace. Many employers now require certification because they know that certified individuals are committed to their profession and to quality patient care. Certified individuals also tend to earn higher wages and enjoy greater opportunities for promotion.