
Cloudera CCA-470 Exam - Cheat-Test.com

Free CCA-470 Sample Questions:

Q: 1
On a cluster running MapReduce v1 (MRv1), a MapReduce job is given a directory of 10 plain text files as its input. Each file is made up of 3 HDFS blocks. How many Mappers will run?
A. We cannot say; the number of Mappers is determined by the developer
B. 30
C. 10
D. 1
Answer: B
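
Why B: MRv1 launches one map task per input split, and for plain text files the default TextInputFormat creates one split per HDFS block, so 10 files x 3 blocks each yields 30 Mappers. A minimal sketch using the old org.apache.hadoop.mapred API (the input path is hypothetical):

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapred.*;

    public class SplitCount {
        public static void main(String[] args) throws Exception {
            JobConf conf = new JobConf(SplitCount.class);
            conf.setInputFormat(TextInputFormat.class);
            FileInputFormat.setInputPaths(conf, new Path("/user/demo/input"));
            // The framework runs one map task per split returned here:
            InputSplit[] splits = conf.getInputFormat().getSplits(conf, 1);
            System.out.println("Map tasks: " + splits.length); // 10 files x 3 blocks = 30
        }
    }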

Q: 2
Your developers request that you enable them to use Hive on your Hadoop cluster. What do you need to install and/or configure?
A. Install the Hive interpreter on the client machines only, and configure a shared remote Hive Metastore.
B. Install the Hive interpreter on the client machines and all the slave nodes, and configure a shared remote Hive Metastore.
C. Install the Hive interpreter on the master node running the JobTracker, and configure a shared remote Hive Metastore.
D. Install the Hive interpreter on the client machines and all nodes on the cluster
Answer: A
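
Why A: Hive is a client-side tool, so only the machines where users run the Hive interpreter need it installed; the clients share table metadata through a remote Metastore service. A minimal client-side hive-site.xml sketch, assuming a Metastore service already runs on a hypothetical host at the default port 9083:

    <!-- hive-site.xml on each client machine; the hostname is a placeholder. -->
    <configuration>
      <property>
        <!-- Point the Hive interpreter at the shared remote Metastore. -->
        <name>hive.metastore.uris</name>
        <value>thrift://metastore.example.com:9083</value>
      </property>
    </configuration>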

Q: 3
How must you format the underlying filesystem of your Hadoop cluster’s slave nodes running on Linux?
A. They may be formatted with any Linux filesystem
B. They must be formatted as HDFS
C. They must be formatted as either ext3 or ext4
D. They must not be formatted; HDFS will format the filesystem automatically
Answer: C
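
Why C: HDFS is a user-space filesystem layered on top of a native Linux filesystem; it never formats disks itself, and CDH of this era supports ext3 or ext4 for DataNode storage. A sketch of the corresponding hdfs-site.xml entry on a slave node, with hypothetical mount points that would each sit on an ext3/ext4 partition:

    <!-- hdfs-site.xml on a DataNode; the mount points are placeholders. -->
    <property>
      <name>dfs.data.dir</name>  <!-- renamed dfs.datanode.data.dir in later releases -->
      <value>/data/1/dfs/dn,/data/2/dfs/dn</value>
    </property>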

Q: 4
Where does a MapReduce job store the intermediate data output from Mappers?
A. On the underlying filesystem of the local disk of the machine on which the JobTracker ran.
B. In HDFS, in the job’s output directory.
C. In HDFS, in a temporary directory defined by mapred.tmp.dir.
D. On the underlying filesystem of the local disk of the machine on which the Mapper ran.
E. On the underlying filesystem of the local disk of the machine on which the Reducer ran.
Answer: D
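
Why D: map output is spilled to local (non-HDFS) disk on the node that ran the Mapper, under the directories named by mapred.local.dir, and Reducers later fetch it from there over HTTP. A sketch of the MRv1 setting, with placeholder paths:

    <!-- mapred-site.xml on each TaskTracker node; paths are placeholders. -->
    <!-- Intermediate map output spills to these local directories, not HDFS. -->
    <property>
      <name>mapred.local.dir</name>
      <value>/data/1/mapred/local,/data/2/mapred/local</value>
    </property>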

Q: 5
Which two features does Kerberos security add to a Hadoop cluster?
A. Authentication for user access to the cluster against a central server
B. Encryption for data on disk ("at rest")
C. Encryption on all remote procedure calls (RPCs)
D. User authentication on all remote procedure calls (RPCs)
E. Root access to the cluster for users hdfs and mapred but non-root access for clients
Answer: A,D
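
Why A and D: Kerberos adds authentication, both of users against a central KDC and on every RPC; it does not encrypt data at rest, and encrypting RPC traffic requires settings beyond those shown here. The cluster-wide switch lives in core-site.xml:

    <!-- core-site.xml: enable Kerberos authentication (the default is "simple"). -->
    <property>
      <name>hadoop.security.authentication</name>
      <value>kerberos</value>
    </property>
    <property>
      <!-- Enforce service-level authorization checks on RPC. -->
      <name>hadoop.security.authorization</name>
      <value>true</value>
    </property>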

Q: 6
Identify four pieces of cluster information that are stored on disk on the NameNode.
A. A catalog of DataNodes and the blocks that are stored on them.
B. Names of the files in HDFS.
C. The directory structure of the files in HDFS.
D. An edit log of changes that have been made since the last snapshot of the NameNode.
E. An edit log of changes that have been made since the last snapshot compaction by the Secondary NameNode.
F. File permissions of the files in HDFS.
G. The status of the heartbeats of each DataNode.
Answer: B,C,E,F
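
Why B, C, E, and F: the fsimage file persists the namespace (file names, directory tree, permissions, and ownership) and the edits file records changes since the last checkpoint; block-to-DataNode mappings and heartbeat status are held only in the NameNode's memory, rebuilt from DataNode block reports. The on-disk location is set in hdfs-site.xml (paths here are hypothetical; one local disk plus an NFS mount is a common layout):

    <!-- hdfs-site.xml on the NameNode; each listed directory holds a full
         copy of the fsimage snapshot and the edits log. -->
    <property>
      <name>dfs.name.dir</name>
      <value>/data/1/dfs/nn,/nfsmount/dfs/nn</value>
    </property>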

Q: 7
Which command does Hadoop offer to discover missing or corrupt HDFS data?
A. The map-only checksum utility.
B. Fsck
C. Du
D. Dskchk
E. Hadoop does not provide any tools to discover missing or corrupt data; there is no need because three replicas are kept for each data block.
Answer: B
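
Why B: fsck runs against the NameNode's metadata rather than the raw disks and reports missing, corrupt, and under-replicated blocks. A typical invocation against the whole namespace:

    # List files, their blocks, and the DataNodes holding each block.
    hadoop fsck / -files -blocks -locations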

Q: 8
What does CDH packaging do on install to facilitate Kerberos security setup?
A. Automatically configures permissions for log files at $MAPRED_LOG_DIR/userlogs
B. Creates and configures your KDC with default cluster values.
C. Creates users for hdfs and mapreduce to facilitate role assignment.
D. Creates a set of pre-configured Kerberos keytab files and their permissions.
E. Creates directories for temp, hdfs, and mapreduce with correct permissions.
Answer: C
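
Why C: installing the CDH packages creates the hdfs and mapred Unix service accounts, but keytabs, KDC setup, and directory permissions remain manual steps. You can confirm the accounts after installation:

    # Service accounts created by the CDH packages:
    id hdfs
    id mapred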


© 2014 Cheat-Test.com, All Rights Reserved