Use Cases Of MongoDB
MongoDB is a relatively new contender in the data storage arena compared to giants like Oracle and IBM DB2, but it has gained huge popularity with its distributed key-value store, MapReduce calculation capability, and document-oriented NoSQL features.
MongoDB has ...
Introduction to Impala
Impala is significant in the Hadoop ecosystem because of its:
Scalability
Flexibility
Efficiency
What’s Impala?
Impala is…
Interactive SQL – Impala is typically 5 to 65 times faster than Hive, reducing response times to seconds rather than minutes.
Nearly ANSI SQL-92 standard-compliant and compatible with ...
Free Cloudera Impala Book
Get the Cloudera Impala book, in PDF format, free from the Cloudera website, in association with the Strata Conference and Hadoop World. See the link below for the book info from the publisher as well as the download link ...
Hadoop Interview Questions – HDFS
Are you planning to pursue a career in Hadoop or looking out for a job opportunity in Hadoop? Here is a list of Hadoop interview questions covering HDFS.
What is the difference between Hadoop and a relational database?
Hadoop is not ...
Data Export from Hadoop MapReduce to Database
Hadoop has become a huge part of the data warehouse stack in most companies. It is used for a variety of use cases: search and web indexing, machine learning, analytics and reporting, and so on. Most organizations are building Hadoop clusters in addition ...
Hadoop Cluster Commissioning and Decommissioning Nodes
To add new nodes to the cluster:
1. Add the network addresses of the new nodes to the include file.
hdfs-site.xml
<property>
  <name>dfs.hosts</name>
  <value>/<hadoop-home>/conf/includes</value>
  <final>true</final>
</property>
mapred-site.xml
<property>
  <name>mapred.hosts</name>
  <value>/<hadoop-home>/conf/includes</value>
  <final>true</final>
</property>
Datanodes that are permitted to connect to the namenode are specified in a
file whose name is specified by the dfs.hosts property.
Includes file ...
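After the include files are updated, the namenode and jobtracker can be told to re-read them without a restart. A minimal sketch, assuming a Hadoop 1.x installation (matching the mapred.hosts property used above), where these admin commands are available:

```shell
# Tell the namenode to re-read the dfs.hosts include file
hadoop dfsadmin -refreshNodes

# Tell the jobtracker to re-read the mapred.hosts include file
hadoop mradmin -refreshNodes
```

Once the refresh completes, the daemons on the newly listed nodes can be started and they will register with the cluster.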
BigData TechCon-Learn HOW TO Master Big Data, Mar 31-Apr 2, Boston
Big Data TechCon, March 31-April 2, Boston, is the “how-to” big data event. Use code BIGDATA for $200 discount. www.bigdatatechcon.com
Plan now to attend Big Data TechCon, March 31-April 2 in Boston, to learn HOW-TO accommodate the terabytes and petabytes of data ...
MongoDB Interview Questions
What were you trying to solve when you created MongoDB?
We were and are trying to build the database that we always wanted as developers. For pure reporting, SQL and relational databases are nice, but when building, we always wanted something different: ...
Hadoop Interview Questions – MapReduce
Looking out for Hadoop Interview Questions that are frequently asked by employers?
What is MapReduce?
It is a framework (or programming model) for processing large data sets across clusters of computers using distributed programming.
What are 'maps' and 'reduces'?
'Maps' ...
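The 'maps' and 'reduces' above can be illustrated with a plain-Python word count. This is a sketch of the programming model only, not the Hadoop API; the function names here are made up for illustration:

```python
from collections import defaultdict

def map_phase(document):
    """Map: emit a (word, 1) pair for every word in an input split."""
    for word in document.split():
        yield (word, 1)

def shuffle(pairs):
    """Shuffle: group values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Reduce: combine all values for one key into a single result."""
    return (key, sum(values))

documents = ["big data is big", "data is data"]
pairs = [p for doc in documents for p in map_phase(doc)]
counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
# counts == {'big': 2, 'data': 3, 'is': 2}
```

In real Hadoop the map and reduce functions run on different machines, and the shuffle step is performed by the framework over the network; the logic, however, is exactly this.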
3 Tools Companies Can Use to Harness the Power of Big Data
To the individual user, Big Data might simply mean a new 3-terabyte hard drive, which can be acquired for a hundred bucks or so. But real Big Data projects require clusters of servers, vast amounts of storage, and specialized software ...