Service POV / POC Hadoop as the Data Lake

The amount of data grows incessantly every day. Its variety, both in character and in the way it arrives, is a great challenge for companies; at the same time, the value hidden in that data is too precious to leave unused. For a vast number of enterprises the Big Data world is a new, unexplored area, yet without it further development and competitiveness will be difficult to maintain.


To address the need to learn about Big Data solutions, and to make the beginning of this adventure easy and simple, we have created a service that lets you check the usefulness of the Hadoop platform as a large company data repository. With the POV / POC Hadoop as the Data Lake service we will not only build such a solution, whether on-premise or in the cloud, but also help migrate relevant data from the transactional and analytical databases used so far, as well as from other sources that enrich the data and provide valuable insight into company operations. Learning about the features offered by this modern platform is all the more valuable because it takes place in the target environment and with real data, which guarantees reliable information and indicators showing new possibilities for data storage and modern analysis of data from a variety of sources.


The added value for the customer is undoubtedly the opportunity to test integration methods and to work on the new platform without the need for an expensive upfront investment.


Service stages:

 

Analysis and planning

At this stage, a thorough analysis of the client's current environment will be carried out, and the activities aimed at copying selected customer data to the Hadoop system will be planned and described. Depending on the database systems and other sources involved, the most appropriate migration method will be chosen, and a schedule and implementation plan will be presented. For the sake of security and to ensure the continuity of the client's business processes, the service will be carried out with the utmost care for the data and for the performance of production systems.
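
Part of this analysis may consist of profiling the candidate source tables so the transfer can be sized and later validated. The following is a minimal sketch of such a check, assuming a Spark environment with the appropriate JDBC driver available; the connection URL, credentials and table name are placeholders, not actual customer details:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("pre-migration-profile").getOrCreate()

    # Read the candidate source table over JDBC (read-only access,
    # so production writes are not affected).
    orders = (
        spark.read.format("jdbc")
        .option("url", "jdbc:postgresql://source-db:5432/erp")   # placeholder
        .option("dbtable", "public.orders")                      # placeholder
        .option("user", "readonly_user")
        .option("password", "***")
        .load()
    )

    # Basic profiling used to size the transfer and to validate it afterwards.
    print("row count:", orders.count())
    orders.printSchema()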

 

Migration / transfer of data to the Hadoop platform

At this stage, a new server on which the Hadoop ecosystem will run is installed if an on-premise solution is chosen, or an equivalent environment is created in the selected public cloud.

All connections required for the data transfer will then be established. Subsequently, the migration will be performed and the analytical and visualization tools will be configured.
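
As an illustration of what such a transfer can look like, the sketch below copies a single relational table into HDFS as Parquet and registers it as a Hive table. It assumes a Hive-enabled Spark session on the new cluster; the connection details, partitioning bounds, paths and table names are placeholders, and the actual method will depend on the sources identified during the analysis stage:

    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder.appName("orders-to-datalake")
        .enableHiveSupport()
        .getOrCreate()
    )

    # Pull the source table over JDBC in parallel partitions to limit
    # the load placed on the production database.
    orders = (
        spark.read.format("jdbc")
        .option("url", "jdbc:postgresql://source-db:5432/erp")
        .option("dbtable", "public.orders")
        .option("user", "readonly_user")
        .option("password", "***")
        .option("partitionColumn", "order_id")
        .option("lowerBound", "1")
        .option("upperBound", "10000000")
        .option("numPartitions", "8")
        .load()
    )

    # Land the data in HDFS as Parquet and expose it as a Hive table for the
    # analytical and visualization tools configured next.
    spark.sql("CREATE DATABASE IF NOT EXISTS raw")
    (
        orders.write.mode("overwrite")
        .format("parquet")
        .option("path", "hdfs:///data/lake/raw/orders")
        .saveAsTable("raw.orders")
    )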

 

Post-migration services:

At this stage, services presenting the new features of the Hadoop solution will be delivered in the form of short workshops covering:

  • Administration tools
  • Data access tools (a query sketch follows this list)
  • Data visualization tools
  • Technical documentation of the solution
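
To give a taste of the data access part of the workshop, here is a minimal sketch of querying the Data Lake with SQL. It assumes the raw.orders table created during migration; the column names used in the query (order_date, total_amount) are placeholders:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.enableHiveSupport().getOrCreate()

    # Query the Data Lake with plain SQL, exactly as analysts would do from
    # their reporting and visualization tools.
    monthly = spark.sql("""
        SELECT date_format(order_date, 'yyyy-MM') AS month,
               sum(total_amount)                  AS revenue
        FROM raw.orders
        GROUP BY date_format(order_date, 'yyyy-MM')
        ORDER BY month
    """)
    monthly.show(12)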

 

Additional services:

At the customer's request, it is possible, and strongly recommended, to plan and deliver additional workshops whose scope is to be agreed. These can cover, for example:

  • Integration with other client systems
  • Assistance in matching Hadoop ecosystem projects to the client's business requirements
  • Data security mechanisms (a masking sketch follows this list)
  • Hadoop cluster security in general and the options for implementing such mechanisms
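
As one small example of a data security mechanism that such a workshop could cover, the sketch below exposes sensitive values only as hashes through a dedicated view. It assumes the raw.orders table from the migration stage, and the customer_email column is a placeholder:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.enableHiveSupport().getOrCreate()

    # Analysts query the masked view instead of the raw table, so e-mail
    # addresses never leave the cluster in clear text.
    spark.sql("""
        CREATE OR REPLACE VIEW raw.orders_masked AS
        SELECT order_id,
               order_date,
               total_amount,
               sha2(customer_email, 256) AS customer_email_hash
        FROM raw.orders
    """)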

 

Customer benefits associated with the service:

  • A unique opportunity to test how the Hadoop system behaves as a Data Lake for the main systems storing the customer's data
  • A chance to learn about modern elements of the Hadoop ecosystem for analyzing and visualizing data, and to see how they behave in the customer's current infrastructure
  • The opportunity to learn about the latest market trends in Big Data and Big Data Analytics