While both services provide execution tracking, retry and exception handling, and the ability to run arbitrary actions, AWS Data Pipeline is purpose-built for the steps common to most data-driven workflows: running activities only after their input data meets readiness preconditions, copying data between different data stores, and scheduling chained transforms. Because of this narrow focus, Data Pipeline workflow definitions can be created quickly and without any code or programming knowledge.
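To give a sense of that declarative style, a minimal Data Pipeline definition might look like the fragment below. The object types (`Schedule`, `S3DataNode`, `CopyActivity`) follow the Data Pipeline JSON syntax, but the identifiers and S3 paths are hypothetical, and a real definition needs additional fields (resources, roles, etc.):

```json
{
  "objects": [
    {
      "id": "DailySchedule",
      "type": "Schedule",
      "period": "1 day",
      "startDateTime": "2024-01-01T00:00:00"
    },
    {
      "id": "InputData",
      "type": "S3DataNode",
      "schedule": { "ref": "DailySchedule" },
      "directoryPath": "s3://example-bucket/input/"
    },
    {
      "id": "OutputData",
      "type": "S3DataNode",
      "schedule": { "ref": "DailySchedule" },
      "directoryPath": "s3://example-bucket/output/"
    },
    {
      "id": "CopyData",
      "type": "CopyActivity",
      "schedule": { "ref": "DailySchedule" },
      "input": { "ref": "InputData" },
      "output": { "ref": "OutputData" }
    }
  ]
}
```

The whole workflow — when to run, where the data lives, what to do with it — is expressed as configuration rather than code.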
Amazon Simple Workflow (SWF), by contrast, is a more general and more powerful service: you write the workflow logic itself in code, and SWF coordinates its execution, state, and task distribution. For example, many e-commerce systems have scalability problems in their order-processing pipelines; with SWF you can implement the ordering workflow in your own code (deciders and activity workers) while SWF handles the bookkeeping across distributed machines.
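To illustrate the decider pattern, here is a minimal local sketch of the kind of decision logic an SWF decider would run for an order workflow. Real SWF code would poll for decision tasks and schedule activities through the AWS API (e.g. via boto3); this sketch models the step sequence in-process so the control flow is easy to see, and all names are hypothetical, not part of any AWS API:

```python
# Hypothetical order-processing steps, executed in sequence.
ORDER_STEPS = ["verify_payment", "reserve_inventory", "ship_order", "send_confirmation"]

def next_decision(completed_steps):
    """Given the history of completed steps, decide what to do next --
    the way an SWF decider inspects workflow history to pick the next
    activity to schedule."""
    for step in ORDER_STEPS:
        if step not in completed_steps:
            return {"decision": "ScheduleActivityTask", "activity": step}
    return {"decision": "CompleteWorkflowExecution"}

def run_order_workflow():
    """Drive the workflow to completion, simulating activity workers."""
    completed = []
    while True:
        decision = next_decision(completed)
        if decision["decision"] == "CompleteWorkflowExecution":
            return completed
        # In real SWF, an activity worker would perform this step and
        # report success; here we just record it as done.
        completed.append(decision["activity"])

print(run_order_workflow())
# → ['verify_payment', 'reserve_inventory', 'ship_order', 'send_confirmation']
```

The key idea is that the *workflow logic* (which step comes next, how failures are handled) lives in your code, while SWF provides durable state, task queues, and retry plumbing — the opposite trade-off from Data Pipeline's declarative definitions.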