I'd like to point out a few things.
If you want to do a POC with just 1 laptop, there's little point in using Hadoop.
Also, as others have said, Hadoop is not designed for realtime applications, because there is some overhead in starting and running Map/Reduce jobs.
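To make concrete what "running a Map/Reduce job" means, here is a minimal Hadoop Streaming-style word count sketched in plain Python (my own illustrative code, not from any Hadoop distribution). Hadoop Streaming pipes input lines to the mapper on stdin and feeds the sorted mapper output to the reducer, and every such job launch pays scheduling and startup costs — which is why per-job latency is poor for realtime use:

```python
#!/usr/bin/env python
# Minimal Hadoop Streaming-style word count (an illustrative sketch).
# With Hadoop Streaming, mapper and reducer are plain stdin/stdout filters;
# the framework handles the shuffle/sort between them.
import sys
from itertools import groupby

def mapper(lines):
    """Emit one (word, 1) pair per token."""
    for line in lines:
        for word in line.split():
            yield word, 1

def reducer(pairs):
    """Sum counts per word; assumes pairs arrive sorted by word."""
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    # Local simulation of the map -> shuffle/sort -> reduce pipeline.
    mapped = sorted(mapper(sys.stdin))
    for word, count in reducer(mapped):
        print("%s\t%d" % (word, count))
```

Even this trivial job, when submitted to a cluster, incurs startup overhead that a realtime query engine avoids.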
That being said, Cloudera released Impala, which works with the Hadoop ecosystem (specifically the Hive metastore) to achieve realtime performance. Be aware that to achieve this it does not generate Map/Reduce jobs, and that it is currently in beta, so use it carefully.
So I would really advise looking at Impala, since it lets you stay within the Hadoop ecosystem. But if you're also considering alternatives, here are a few other frameworks that could be of use:
- Druid : was open-sourced by MetaMarkets. Looks interesting, even though I've not used it myself.
- Storm : has no integration with HDFS; it just processes data as it comes.
- HStreaming : integrates with Hadoop.
- Yahoo S4 : seems pretty close to Storm.
In the end, I think you should really analyze your needs and see whether Hadoop is the right fit, because it's only getting started in the realtime space. There are several other projects that could help you achieve realtime performance.
If you want ideas of projects to showcase, I suggest looking at this link. Here are some examples:
- Classify investment opportunities as good or not e.g. based on industry/company metrics, portfolio diversity and currency risk.
- Classify credit card transactions as valid or invalid based on, e.g., the location of the transaction and of the card holder, the date, the amount, the purchased item or service, and the history of transactions and similar transactions.
- Classification of proteins into structural or functional classes
- Diagnostic classification, e.g. cancer tumours based on images
- Document Classification and Ranking
- Malware classification, email/tweet/web spam classification
- Production systems (e.g. in the energy or petrochemical industries)
- Classify and detect situations (e.g. sweet spots or risk situations) based on realtime and historical data from sensors
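As a toy sketch of the credit-card example above, here is a minimal nearest-centroid classifier in plain Python. The feature choices and data are entirely invented for illustration (not from any real fraud system); in practice you would use far richer features and a proper library (e.g. Mahout on Hadoop, or scikit-learn):

```python
import math

# Toy feature vectors: (amount in $, distance from home in km, hour of day).
# The data and labels below are invented purely for illustration.
TRAINING = [
    ((25.0, 2.0, 12), "valid"),
    ((40.0, 5.0, 18), "valid"),
    ((15.0, 1.0, 9),  "valid"),
    ((900.0, 4000.0, 3),  "invalid"),
    ((1200.0, 3500.0, 4), "invalid"),
]

def centroids(samples):
    """Compute the mean feature vector of each class."""
    sums, counts = {}, {}
    for features, label in samples:
        counts[label] = counts.get(label, 0) + 1
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
    return {label: tuple(v / counts[label] for v in acc)
            for label, acc in sums.items()}

def classify(features, cents):
    """Assign the class whose centroid is nearest (Euclidean distance)."""
    return min(cents, key=lambda label: math.dist(features, cents[label]))
```

Usage: build the centroids once from labeled history, then call `classify` on each incoming transaction, e.g. `classify((30.0, 3.0, 14), centroids(TRAINING))`.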