Most of us know the story of the woodcutter who went to work for a timber merchant. His salary was based on the number of trees he could cut per day. On the first day, he cut 20 trees. Happy with the result, and even more motivated, he tried harder the next day and came back with 30 trees. However, his success was short-lived. After a week, the number of trees he was cutting had dwindled. "I must be losing my strength," the woodcutter thought. He went to see his boss and apologized for not living up to expectations. He was taken aback when his boss asked him,
“When did you last sharpen your axe?”
He replied, “I have been busy cutting more trees for you and I just did not have the time to sharpen my axe.”
Cut to the modern world: I am sure many of us can relate to this story to some extent. What is crystal clear is this: the axe — or in today's context, technology — needs to be sharpened constantly, without which progress is impossible.
Most companies today are waking up to the need for technology across all domains and reaping the benefits. Let's take the example of book retailers. Earlier, traditional booksellers could easily track which books were popular and which were not, based on how many copies of each title were sold. If there was a loyalty program, they could tie some of those purchases to individual customers. That was about it.
But once the focus shifted to online shopping, there was a complete transformation in how customers were understood. Online retailers — the biggest example being Amazon — were able to track not only what customers bought, but also their viewing history: how they navigated, and how reviews, page layout and promotions influenced them. They even came up with algorithms to predict which book a particular customer would love to read next. Booksellers in physical stores just could not have this kind of information.
No prizes for guessing why Amazon pushed several bookstores out of business. It is evident that it tapped into the volumes of customer data that traditional booksellers were overlooking.
This is where big data comes into the picture. The hype surrounding big data is not just hype. We now live in a world that is dominated by big data, whether we accept the fact or not. The amount of data generated across the world is growing at an undeniable pace. Pat Gelsinger, the CEO of VMware, has rightly said, "Data is the new science. Big data holds the answers." Using that data effectively is the crux of the matter.
Companies like Facebook and Twitter have been using big data efficiently for quite some time now. Today, organizations across all domains, whether big MNCs or startups — be it social media, health care, finance or airlines — are embracing the big data wave and are investing in it big time. The domino effect of these upgrades and new initiatives is bringing in a lot of changes in job titles and job roles.
But the big question is: Are professionals ready to upgrade to the latest technology and take up new challenges? Shifting to big data is imperative, as it touches nearly every aspect of our lives, whether we realize it or not.
Technology moves at a very fast pace. And if a Java professional is still fiddling with Java 1.3 code, he needs to look ahead and upgrade to the most up-to-date technology. Big data and Hadoop have become almost synonymous. Going by the demand and its growing popularity, Hadoop — an open-source, Java-based programming framework — rules the market today. International Data Corporation predicts that the worldwide big data and Hadoop market will hit the $23.8 billion mark by 2016.
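For the Java professional wondering what the leap actually looks like: Hadoop's core programming model is MapReduce, where a "map" step breaks raw input into key-value pairs and a "reduce" step aggregates them per key. A real Hadoop job extends the `Mapper` and `Reducer` classes from `org.apache.hadoop.mapreduce` and runs distributed across a cluster; the single-machine sketch below is only an illustration of the model, using the classic word-count example in plain Java:

```java
import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

// Toy illustration of the map/reduce idea behind Hadoop.
// Real Hadoop jobs distribute these two phases across a cluster;
// here both phases run in one JVM purely to show the shape of the model.
public class WordCountSketch {

    public static Map<String, Long> wordCount(String text) {
        return Arrays.stream(text.toLowerCase().split("\\W+")) // "map": emit one token per word
                .filter(word -> !word.isEmpty())
                .collect(Collectors.groupingBy(               // "shuffle": group identical keys
                        word -> word,
                        Collectors.counting()));              // "reduce": sum the counts per key
    }

    public static void main(String[] args) {
        System.out.println(wordCount("to be or not to be"));
        // e.g. {be=2, not=1, or=1, to=2}
    }
}
```

The appeal for a Java developer is that the mental model carries over directly: the same map, group-by-key and reduce phases, just scaled out over terabytes instead of a string.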
Is it prudent for a software testing professional to jump on the Hadoop bandwagon? The answer, I am sure, will be 'yes' for many. A testing professional's job, which entails ironing out bugs and improving the quality of the finished product, can get monotonous at times. He may feel stuck in a rut after a point, doing the same kind of work day in and day out. This is when upgrading his skills to big data and Hadoop can come in handy, and his realm of opportunities will also open up.
Even a mainframe professional's work involves bulk data processing. And handling volumes of unstructured data can be time-consuming, besides getting monotonous. Take the case of a person involved in census data processing on a mainframe. His job includes monitoring and collecting questionnaires, checking, data entry, storage, tabulation and so on. This can get mind-boggling, right? The process is not only time-consuming but also expensive. Hadoop, being an open-source platform, can be the most viable alternative for him to manage volumes of data. With Hadoop he will also have better career opportunities, which are increasing by the day.
What about the data warehousing professional and the ETL developer, who handle loads and loads of data? Given the enormous flow of data today, they get so caught up in it that their work is restricted to just managing that flow. But by upgrading to Hadoop, these professionals can handle volumes of data far more effectively. And let's not forget the big opportunities in the data management sector.
There is also the business intelligence professional, whose challenge lies in storing and analyzing big data. For example, in an advertising agency, he will constantly need answers to analytic questions such as: What drives people to certain content? What's their profile? How do we draw more people to an area? With the help of Hadoop, he can scale up and deliver good answers to such questions quickly and frequently.
Whether you are a Java professional, a software testing engineer or a business intelligence professional, there is no debating the fact that big data technologies are becoming a common requirement. Therefore, you need to look beyond your current role and rise to the challenges of big data technologies — lest you become like the woodcutter who was so busy felling trees that he forgot to sharpen his axe.
Got a question for us? Mention it in the comments section and we will get back to you.