CDH is basically a packaged deal: a Cloudera distribution that bundles Hadoop and related frameworks, with a management interface and everything pre-installed for you.
The bundled Spark version may be 1.6 right now, but Cloudera upgrades it in later CDH releases.
If you just want to work on Spark, it's better to install Hadoop and Spark separately on your VM or system. That will be enough to get started, and you can add other frameworks later as needed.
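A minimal sketch of what that standalone setup looks like on Linux. The version numbers, archive URLs, and `/opt` install paths below are just illustrative assumptions; check the Apache Hadoop and Spark download pages for the current releases:

```shell
# Download and unpack the release tarballs (example versions/URLs):
# wget https://archive.apache.org/dist/hadoop/common/hadoop-2.7.3/hadoop-2.7.3.tar.gz
# wget https://archive.apache.org/dist/spark/spark-1.6.3/spark-1.6.3-bin-hadoop2.6.tgz
# tar -xzf hadoop-2.7.3.tar.gz  -C /opt
# tar -xzf spark-1.6.3-bin-hadoop2.6.tgz -C /opt

# Point your shell at the installs (add these lines to ~/.bashrc to persist):
export HADOOP_HOME=/opt/hadoop-2.7.3
export SPARK_HOME=/opt/spark-1.6.3-bin-hadoop2.6
export PATH="$PATH:$HADOOP_HOME/bin:$SPARK_HOME/bin"

# Sanity checks once the binaries are on PATH:
# hadoop version
# spark-shell --version
```

Because Spark only needs Hadoop's client libraries to talk to HDFS, this two-tarball setup is enough for local development; you don't need the full CDH stack for that.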