PROJECT REPORT - https://github.com/cloudmesh/sp17-i524/blob/master/project/S17-IR-P003/report/report.pdf
git clone https://github.com/cloudmesh/cloudmesh.street.git
After getting a local copy of the git repository, go to the directory "./ansible" and update the following variables in "user_vars.yaml":
---
###########################################################
#Variables for execution of complete package
#EDIT FOLLOWING DETAILS AS PER REQUIREMENT
##############CLOUDMESH SETTINGS###########################
#cloud: "chameleon" or "jetstream"
cloud: <TBD>
#username: "cloudmesh username as key_name"
username: <TBD>
############HADOOP CLUSTER SETTINGS########################
#Chameleon image_name: CC-Ubuntu14.04
#jetstream image_name: ubuntu-14.04-trusty-server-cloudimg
#flavor: m1.small, m1.medium, m1.large [preferred: m1.medium]
#addons: spark pig hive
image_name: <TBD>
count: <TBD>
flavor: <TBD>
addons: <TBD>
###########################################################
An example of a filled-in user_vars.yaml (Chameleon cloud):
---
############################################################
#Variables for execution of complete package
#EDIT FOLLOWING DETAILS AS PER REQUIREMENT
##############CLOUDMESH SETTINGS############################
#cloud: "chameleon" or "jetstream"
cloud: "chameleon"
#username: "cloudmesh username as key_name"
username: "rraghata"
############HADOOP CLUSTER SETTINGS#########################
#Chameleon image_name: CC-Ubuntu14.04
#jetstream image_name: ubuntu-14.04-trusty-server-cloudimg
#flavor: m1.small, m1.medium, m1.large [preferred: m1.medium]
#addons: spark pig hive
image_name: "CC-Ubuntu14.04"
count: "6"
flavor: "m1.medium"
addons: "spark"
############################################################
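For reference, the variables above are edited in place in the ansible directory; a minimal shell example is given below (any text editor can be used, nano is only an assumption):
cd cloudmesh.street/ansible
nano user_vars.yaml    # fill in the <TBD> values as shown in the example above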
Go to the directory 'cloudmesh.street/code/scripts'.
STEP 3: To install Ansible and the cloudmesh client for the first time and to run the complete package, run the following script:
. run_all.sh
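For orientation, a minimal sketch of what run_all.sh presumably does is given below, assuming it simply chains the scripts described in steps 3.1-3.6; the actual script in the repository may differ:
# hypothetical outline of run_all.sh
. setup.sh           # 3.1 install Ansible and the cloudmesh client
. configure.sh       # 3.2 configure the cloud
. deploy.sh          # 3.3 deploy the Hadoop cluster
. opencv_setup.sh    # 3.4 set up OpenCV
. sign_detection.sh  # 3.5 run the sign detection analysis
. transfer.sh        # 3.6 transfer the output to the local machine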
(Note: Skip step 3.1 if Ansible and the cloudmesh client are already installed.)
3.1 Run the script setup.sh to install Ansible and the cloudmesh client:
. setup.sh
The above script uses the playbook ansible/local_setup.yaml. Edit the cloudmesh.yaml file as per your requirements [details given in the Appendix].
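Each of these helper scripts is, presumably, a thin wrapper that invokes its playbook with ansible-playbook. A minimal sketch of what setup.sh might contain is shown below; the relative path, inventory, and connection options are assumptions, so check the actual script in the repository:
# hypothetical wrapper around the local setup playbook (options are assumptions)
ansible-playbook -i "localhost," -c local ../../ansible/local_setup.yaml
The other scripts (configure.sh, deploy.sh, opencv_setup.sh, sign_detection.sh, transfer.sh, clean.sh) are expected to follow the same pattern with their respective playbooks.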
3.2 Run the script configure.sh to configure the cloud:
. configure.sh
The above script uses the playbook ansible/cloud_config.yaml.
3.3 Run the script deploy.sh to deploy the Hadoop cluster:
. deploy.sh
The above script uses the playbook ansible/hadoop_deploy.yaml.
3.4 Run the script opencv_setup.sh to set up OpenCV on the cluster:
. opencv_setup.sh
The above script uses the playbook ansible/opencv_setup.yaml.
3.5 Run the script sign_detection.sh to perform the sign detection analysis on the cloud Spark cluster:
. sign_detection.sh
The above script uses the playbook ansible/sign_detection.yaml.
The image dataset as well as a sample video are present in the project directory [details given in the Appendix below]. By default, the program performs sign detection on images.
To perform video analysis, update the last task in the following file:
cloudmesh.street/ansible/roles/analysis/tasks/main.yml
with
su - hadoop -c "spark-submit --master yarn --deploy-mode client --executor-memory 1g --driver-memory 2g --name signdetection --conf spark.app.id=signdetection /opencv_workspace/code/signdetectionbyspark.py /opencv_workspace/test_data/videos/ /opencv_workspace/output/"
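For illustration, the last task in main.yml might then look roughly like the following; the task name and the use of the shell module are assumptions, only the command itself is taken from above:
- name: run sign detection on the video dataset   # hypothetical task name
  shell: >
    su - hadoop -c "spark-submit --master yarn --deploy-mode client
    --executor-memory 1g --driver-memory 2g --name signdetection
    --conf spark.app.id=signdetection
    /opencv_workspace/code/signdetectionbyspark.py
    /opencv_workspace/test_data/videos/ /opencv_workspace/output/"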
NOTE: You might run into memory issues if you use the m1.small flavor for cluster creation, since the jobs need at least the m1.medium flavor to run. For video analysis, the m1.large flavor is preferable for Spark computation.
3.6 Run the script transfer.sh to get the output from the remote VMs (cloud) to the local machine for visual confirmation:
. transfer.sh
The above script uses the playbook ansible/transfer_output_to_local.yaml.
The output is stored in "cloudmesh.street/ansible/output".
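Once the transfer completes, the results can be inspected locally, for example:
ls cloudmesh.street/ansible/output    # list the transferred output files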
Run the script clean.sh to clean up the cloud environment:
. clean.sh
The above script uses the playbook env_clean.yaml. It deletes all the VMs, undefines all the clusters, deletes the output directory, and deletes the stacks.
Run the script benchmark.sh to benchmark all the steps:
. benchmark.sh
The above script runs all the scripts [from ../code/scripts] and records the execution time of every script in the "./benchmark/benchmark_time" file.
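A minimal sketch of how benchmark.sh might time each script is given below; the list of scripts and the paths are assumptions, so the actual script may differ:
# hypothetical timing loop (bash)
mkdir -p ./benchmark
for script in setup.sh configure.sh deploy.sh opencv_setup.sh sign_detection.sh transfer.sh; do
  echo "$script" >> ./benchmark/benchmark_time           # label the entry
  { time . ./$script ; } 2>> ./benchmark/benchmark_time  # record wall-clock time
done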
A.1.1 Edit ~/.cloudmesh/cloudmesh.yaml for the following sections; replace the placeholders in angle brackets with your correct credentials:
profile:
  firstname: <first name>
  lastname: <last name>
  email: <email id>
  user: <chameleon/jetstream/other cloud username>
active:
  - chameleon
clouds:
  ...
A.1.3 Edit the configuration for the active cloud from the list below it (kilo/chameleon/jetstream/...); the entries with placeholders should be customized as per your credentials.
Chameleon Example:
credentials:
  OS_PASSWORD: <enter your chameleon cloud password here>
  OS_TENANT_NAME: CH-818664
  OS_TENANT_ID: CH-818664
  OS_PROJECT_NAME: CH-818664
  OS_USERNAME: <username>
default:
  flavor: m1.medium
  image: CC-Ubuntu14.04
The following directories are included as sample test data:
./images/ - dataset of 50 images
./videos/ - 1 video, stop_video_1.mp4 (2 sec)
Note: Currently, the code supports mp4 video files only.
"STOP SIGN CLASSIFIER" has been provided in "ansible/roles/analysis/files/classifier/"
Multiple classifiers can be added to the directory if you have.