Build #89

Build: #89 was successful

Job: Default Job was successful

Stages & jobs

  1. Default Stage

Build log

The build generated 340 lines of output.

08-Apr-2018 00:01:54 Build CDAP Guides - Cube Guide - release-cdap-3.4-compatible - Default Job #89 (CG-DCCP5-JOB1-89) started building on agent bamboo-agent02.prod.continuuity.net
08-Apr-2018 00:01:54 Remote Agent
08-Apr-2018 00:01:54 Build working directory is /var/bamboo/xml-data/build-dir/CG-DCCP5-JOB1
08-Apr-2018 00:01:54 Executing build CDAP Guides - Cube Guide - release-cdap-3.4-compatible - Default Job #89 (CG-DCCP5-JOB1-89)
08-Apr-2018 00:01:54 Starting task 'Checkout Default Repository' of type 'com.atlassian.bamboo.plugins.vcs:task.vcs.checkout'
08-Apr-2018 00:01:54 Updating source code to revision: 75672301d54839078069bb67cefce1dac2caee82
08-Apr-2018 00:01:54 Fetching 'refs/heads/release/cdap-3.4-compatible' from 'git@github.com:cdap-guides/cdap-cube-guide.git'. Will try to do a shallow fetch.
08-Apr-2018 00:01:55 Warning: Permanently added '[127.0.0.1]:39980' (RSA) to the list of known hosts.
08-Apr-2018 00:01:55 Checking out revision 75672301d54839078069bb67cefce1dac2caee82.
08-Apr-2018 00:01:55 Already on 'release/cdap-3.4-compatible'
08-Apr-2018 00:01:55 Updated source code to revision: 75672301d54839078069bb67cefce1dac2caee82
08-Apr-2018 00:01:55 Finished task 'Checkout Default Repository' with result: Success
08-Apr-2018 00:01:55 Running pre-build action: VCS Version Collector
08-Apr-2018 00:01:55 Running pre-build action: Hung Build Killer PreBuildAction
08-Apr-2018 00:01:55 Starting task 'clean package' of type 'com.atlassian.bamboo.plugins.maven:task.builder.mvn3'
08-Apr-2018 00:01:55
Beginning to execute external process for build 'CDAP Guides - Cube Guide - release-cdap-3.4-compatible - Default Job #89 (CG-DCCP5-JOB1-89)'
... running command line:
/opt/maven/bin/mvn --batch-mode -Djava.io.tmpdir=/tmp/CG-DCCP5-JOB1 clean package
... in: /var/bamboo/xml-data/build-dir/CG-DCCP5-JOB1
... using extra environment variables:
bamboo_planRepository_1_branch=release/cdap-3.4-compatible
bamboo_planRepository_1_branchDisplayName=release/cdap-3.4-compatible
bamboo_repository_revision_number=75672301d54839078069bb67cefce1dac2caee82
MAVEN2_HOME=/opt/maven
bamboo_repository_51970152_git_username=
bamboo_docs_s3_bucket=docs.cask.co
bamboo_capability_system_builder_command_s3cmd=/usr/bin/s3cmd
bamboo_buildKey=CG-DCCP5-JOB1
bamboo_repository_51970152_git_branch=release/cdap-3.4-compatible
bamboo_gpg_password=********
bamboo_ops_gpg_password=********
bamboo_shortJobName=Default Job
bamboo_buildResultsUrl=http://builds.cask.co/browse/CG-DCCP5-JOB1-89
bamboo_repository_s3_bucket=repository.cask.co
bamboo_planRepository_repositoryUrl=git@github.com:cdap-guides/cdap-cube-guide.git
bamboo_capability_system_builder_mvn3_Maven_3_1=/opt/maven
bamboo_repository_51970152_git_repositoryUrl=git@github.com:cdap-guides/cdap-cube-guide.git
bamboo_aws_secret_access_key_password=********
bamboo_agentId=32243713
bamboo_capability_Swapfile=true
bamboo_planRepository_revision=75672301d54839078069bb67cefce1dac2caee82
bamboo_planRepository_previousRevision=75672301d54839078069bb67cefce1dac2caee82
bamboo_capability_RHEL=rhel
bamboo_capability_freight=true
bamboo_repository_branch_name=release/cdap-3.4-compatible
bamboo_kdc_address_password=********
bamboo_capability_GPG=GPG
bamboo_capability_system_jdk_Oracle_JDK_8=/usr/lib/jvm/jdk8
JAVA_HOME=/usr/lib/jvm/jdk1.7.0_75
bamboo_capability_system_jdk_Oracle_JDK_7=/usr/lib/jvm/jdk7
bamboo_downloads_s3_bucket=downloads.cask.co
bamboo_planRepository_branch=release/cdap-3.4-compatible
bamboo_market_s3_access_key_password=********
bamboo_planRepository_1_type=gitv2
bamboo_planRepository_branchName=release/cdap-3.4-compatible
bamboo_capability_SPHINX=Sphinx
bamboo_capability_rsync=true
bamboo_market_cloudfront_secret_key_password=********
bamboo_capability_system_jdk_JDK=/usr/lib/jvm/jdk1.7.0_75
bamboo_planRepository_type=gitv2
bamboo_planRepository_1_username=
bamboo_capability_expect=true
bamboo_capability_RSYNC=rsync
useMavenReturnCode=false
bamboo_capability_system_builder_node_Node_js=/usr/bin/node
bamboo_capability_system_builder_mvn3_Maven_3=/opt/maven
bamboo_planKey=CG-DCCP5
bamboo_hide_jdk_mirror_password=********
bamboo_planRepository_username=
bamboo_repository_51970152_branch_name=release/cdap-3.4-compatible
bamboo_repository_51970152_name=CDAP Cube guide
bamboo_planRepository_1_branchName=release/cdap-3.4-compatible
bamboo_capability_s3cmd=true
bamboo_resultsUrl=http://builds.cask.co/browse/CG-DCCP5-JOB1-89
bamboo_capability_Sphinx=true
bamboo_planRepository_1_name=CDAP Cube guide
bamboo_build_working_directory=/var/bamboo/xml-data/build-dir/CG-DCCP5-JOB1
bamboo_shortPlanName=release-cdap-3.4-compatible
bamboo_planRepository_name=CDAP Cube guide
bamboo_capability_git_lfs=true
bamboo_buildNumber=89
bamboo_planName=CDAP Guides - Cube Guide - release-cdap-3.4-compatible
bamboo_shortPlanKey=DCCP5
bamboo_shortJobKey=JOB1
bamboo_repository_previous_revision_number=75672301d54839078069bb67cefce1dac2caee82
bamboo_repository_51970152_previous_revision_number=75672301d54839078069bb67cefce1dac2caee82
bamboo_buildTimeStamp=2018-04-08T00:00:03.940Z
M2_HOME=/opt/maven
bamboo_s3_bucket=repository.cask.co
bamboo_buildResultKey=CG-DCCP5-JOB1-89
bamboo_capability_system_builder_command_Docker=/usr/bin/docker
bamboo_repository_git_branch=release/cdap-3.4-compatible
bamboo_market_s3_secret_key_password=********
bamboo_buildPlanName=CDAP Guides - Cube Guide - release-cdap-3.4-compatible - Default Job
bamboo_capability_system_builder_command_fpm=/opt/rbenv/shims/fpm
bamboo_capability_system_jdk_JDK_1_8_0_101=/usr/lib/jvm/jdk1.8.0_101
bamboo_planRepository_1_revision=75672301d54839078069bb67cefce1dac2caee82
bamboo_capability_Platform=rhel
bamboo_repository_name=CDAP Cube guide
bamboo_capability_system_docker_executable=/usr/bin/docker
bamboo_agentWorkingDirectory=/var/bamboo/xml-data/build-dir
bamboo_capability_system_git_executable=/usr/local/bin/git
bamboo_planRepository_1_previousRevision=75672301d54839078069bb67cefce1dac2caee82
bamboo_repository_git_username=
bamboo_capability_system_builder_command_Ruby=/opt/rbenv/shims/ruby
bamboo_planRepository_branchDisplayName=release/cdap-3.4-compatible
bamboo_aws_access_key_password=********
bamboo_capability_system_builder_mvn3_Maven_3_x=/opt/maven
bamboo_plan_storageTag=CG-DCCP5
bamboo_repository_git_repositoryUrl=git@github.com:cdap-guides/cdap-cube-guide.git
bamboo_repository_51970152_revision_number=75672301d54839078069bb67cefce1dac2caee82
bamboo_market_cloudfront_access_key_password=********
bamboo_working_directory=/var/bamboo/xml-data/build-dir/CG-DCCP5-JOB1
bamboo_planRepository_1_repositoryUrl=git@github.com:cdap-guides/cdap-cube-guide.git
bamboo_capability_createrepo=true
bamboo_capability_system_jdk_JDK_1_8=/usr/lib/jvm/jdk1.8.0_101
bamboo_capability_rpm_build=true
bamboo_capability_system_jdk_JDK_1_7=/usr/lib/jvm/jdk1.7.0_75
PATH=/usr/lib/jvm/jdk1.7.0_75/bin:/usr/lib/jvm/java/bin:/opt/rbenv/shims:/opt/rbenv/bin:/opt/rbenv/plugins/ruby_build/bin:/usr/local/maven-3.1.1/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/home/bamboo/bin
08-Apr-2018 00:01:57 [INFO] Scanning for projects...
08-Apr-2018 00:01:58 [INFO]                                                                         
08-Apr-2018 00:01:58 [INFO] ------------------------------------------------------------------------
08-Apr-2018 00:01:58 [INFO] Building CDAP Cube Dataset Guide 1.0.0
08-Apr-2018 00:01:58 [INFO] ------------------------------------------------------------------------
08-Apr-2018 00:02:02 [INFO]
08-Apr-2018 00:02:02 [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ cdap-cube-guide ---
08-Apr-2018 00:02:02 [INFO] Deleting /var/bamboo/xml-data/build-dir/CG-DCCP5-JOB1/target
08-Apr-2018 00:02:02 [INFO]
08-Apr-2018 00:02:02 [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ cdap-cube-guide ---
08-Apr-2018 00:02:03 [INFO] Using 'UTF-8' encoding to copy filtered resources.
08-Apr-2018 00:02:03 [INFO] skip non existing resourceDirectory /var/bamboo/xml-data/build-dir/CG-DCCP5-JOB1/src/main/resources
08-Apr-2018 00:02:03 [INFO]
08-Apr-2018 00:02:03 [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ cdap-cube-guide ---
08-Apr-2018 00:02:03 [INFO] Changes detected - recompiling the module!
08-Apr-2018 00:02:03 [INFO] Compiling 4 source files to /var/bamboo/xml-data/build-dir/CG-DCCP5-JOB1/target/classes
08-Apr-2018 00:02:04 [WARNING] /var/bamboo/xml-data/build-dir/CG-DCCP5-JOB1/src/main/java/co/cask/cdap/guides/cube/WebAnalyticsApp.java: /var/bamboo/xml-data/build-dir/CG-DCCP5-JOB1/src/main/java/co/cask/cdap/guides/cube/WebAnalyticsApp.java uses unchecked or unsafe operations.
08-Apr-2018 00:02:04 [WARNING] /var/bamboo/xml-data/build-dir/CG-DCCP5-JOB1/src/main/java/co/cask/cdap/guides/cube/WebAnalyticsApp.java: Recompile with -Xlint:unchecked for details.
08-Apr-2018 00:02:04 [INFO]
08-Apr-2018 00:02:04 [INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ cdap-cube-guide ---
08-Apr-2018 00:02:04 [INFO] Using 'UTF-8' encoding to copy filtered resources.
08-Apr-2018 00:02:04 [INFO] skip non existing resourceDirectory /var/bamboo/xml-data/build-dir/CG-DCCP5-JOB1/src/test/resources
08-Apr-2018 00:02:04 [INFO]
08-Apr-2018 00:02:04 [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ cdap-cube-guide ---
08-Apr-2018 00:02:04 [INFO] Changes detected - recompiling the module!
08-Apr-2018 00:02:04 [INFO] Compiling 1 source file to /var/bamboo/xml-data/build-dir/CG-DCCP5-JOB1/target/test-classes
08-Apr-2018 00:02:05 [INFO]
08-Apr-2018 00:02:05 [INFO] --- maven-surefire-plugin:2.14.1:test (default-test) @ cdap-cube-guide ---
08-Apr-2018 00:02:05 [INFO] Surefire report directory: /var/bamboo/xml-data/build-dir/CG-DCCP5-JOB1/target/surefire-reports
08-Apr-2018 00:02:05
08-Apr-2018 00:02:05 -------------------------------------------------------
08-Apr-2018 00:02:05 T E S T S
08-Apr-2018 00:02:05 -------------------------------------------------------
08-Apr-2018 00:02:06 Running co.cask.cdap.guides.cube.WebAnalyticsAppTest
08-Apr-2018 00:02:09 2018-04-08 00:02:09,841 - INFO  [DatasetOpExecutorService STARTING:c.c.c.d.d.d.s.e.DatasetOpExecutorService@81] - Starting DatasetOpExecutorService...
08-Apr-2018 00:02:09 2018-04-08 00:02:09,930 - INFO  [DatasetOpExecutorService STARTING:c.c.c.d.d.d.s.e.DatasetOpExecutorService@96] - DatasetOpExecutorService started successfully on /127.0.0.1:46812
08-Apr-2018 00:02:10 2018-04-08 00:02:10,194 - INFO  [DatasetService:c.c.c.d.d.d.s.DatasetService@123] - Starting DatasetService...
08-Apr-2018 00:02:10 2018-04-08 00:02:10,209 - INFO  [DatasetTypeManager STARTING:c.c.c.d.d.InMemoryDatasetFramework@247] - Created dataset dataset:system.datasets.instance of type co.cask.cdap.data2.datafabric.dataset.service.mds.DatasetInstanceMDS
08-Apr-2018 00:02:10 2018-04-08 00:02:10,239 - INFO  [DatasetTypeManager STARTING:c.c.c.d.d.InMemoryDatasetFramework@247] - Created dataset dataset:system.datasets.type of type co.cask.cdap.data2.datafabric.dataset.service.mds.DatasetTypeMDS
08-Apr-2018 00:02:10 2018-04-08 00:02:10,366 - INFO  [NettyHttpService STARTING:c.c.c.d.d.d.s.DatasetTypeHandler@83] - Starting DatasetTypeHandler
08-Apr-2018 00:02:10 2018-04-08 00:02:10,379 - INFO  [DatasetService:c.c.c.d.d.d.s.DatasetService$1@138] - Discovered dataset.executor service
08-Apr-2018 00:02:10 2018-04-08 00:02:10,379 - INFO  [DatasetService:c.c.c.d.d.d.s.DatasetService@185] - Waiting for dataset.executor service to be discoverable
08-Apr-2018 00:02:10 2018-04-08 00:02:10,380 - INFO  [DatasetService:c.c.c.d.d.d.s.DatasetService@158] - Announcing DatasetService for discovery...
08-Apr-2018 00:02:10 2018-04-08 00:02:10,382 - INFO  [DatasetService:c.c.c.d.d.d.s.DatasetService@172] - DatasetService started successfully on /127.0.0.1:40375
08-Apr-2018 00:02:10 2018-04-08 00:02:10,643 - INFO  [main:c.c.c.m.q.MetricsQueryService@80] - Configuring MetricsService , address: 127.0.0.1, backlog connections: 20000, execthreads: 20, bossthreads: 1, workerthreads: 10
08-Apr-2018 00:02:10 2018-04-08 00:02:10,644 - INFO  [MetricsQueryService STARTING:c.c.c.m.q.MetricsQueryService@94] - Starting Metrics Service...
08-Apr-2018 00:02:10 2018-04-08 00:02:10,652 - INFO  [MetricsQueryService STARTING:c.c.c.m.q.MetricsQueryService@96] - Started Metrics HTTP Service...
08-Apr-2018 00:02:10 2018-04-08 00:02:10,654 - INFO  [MetricsQueryService STARTING:c.c.c.m.q.MetricsQueryService@110] - Metrics Service started successfully on /127.0.0.1:34540
08-Apr-2018 00:02:10 2018-04-08 00:02:10,763 - WARN  [main:c.c.c.i.a.r.a.ArtifactRepository@104] - Ignoring /opt/cdap/master/artifacts because it is not a directory.
08-Apr-2018 00:02:10 2018-04-08 00:02:10,835 - INFO  [LocalSchedulerService STARTING:o.q.s.RAMJobStore@155] - RAMJobStore initialized.
08-Apr-2018 00:02:11 2018-04-08 00:02:11,045 - INFO  [netty-executor-9:c.c.c.d.d.d.s.DatasetInstanceService@208] - Creating dataset system.schedulestore, type name: co.cask.cdap.api.dataset.table.Table, properties: {}
08-Apr-2018 00:02:11 2018-04-08 00:02:11,059 - INFO  [netty-executor-2:c.c.c.d.d.d.s.e.DatasetAdminService@86] - Creating dataset instance dataset:system.schedulestore, type meta: DatasetTypeMeta{name='co.cask.cdap.api.dataset.table.Table', modules=[DatasetModuleMeta{name='orderedTable-memory', className='co.cask.cdap.data2.dataset2.module.lib.inmemory.InMemoryTableModule', jarLocation=null, types=[table, co.cask.cdap.api.dataset.table.Table], usesModules=[], usedByModules=[core, objectMappedTable, cube, usage, lineage]}]}, props: DatasetProperties{properties={}}
08-Apr-2018 00:02:11 2018-04-08 00:02:11,076 - WARN  [netty-executor-9:c.c.c.d.a.AuditPublishers@120] - Audit publisher is null, audit information will not be published
08-Apr-2018 00:02:11 2018-04-08 00:02:11,113 - INFO  [LocalSchedulerService STARTING:c.c.c.i.a.r.s.AbstractSchedulerService@67] - Started time scheduler
08-Apr-2018 00:02:11 2018-04-08 00:02:11,139 - INFO  [LocalSchedulerService STARTING:c.c.c.i.a.r.s.AbstractSchedulerService@76] - Started stream size scheduler
08-Apr-2018 00:02:11 2018-04-08 00:02:11,151 - INFO  [main:c.c.c.e.s.h.BaseHiveExploreService@223] - Active handle timeout = 86400 secs
08-Apr-2018 00:02:11 2018-04-08 00:02:11,151 - INFO  [main:c.c.c.e.s.h.BaseHiveExploreService@224] - Inactive handle timeout = 3600 secs
08-Apr-2018 00:02:11 2018-04-08 00:02:11,152 - INFO  [main:c.c.c.e.s.h.BaseHiveExploreService@225] - Cleanup job schedule = 60 secs
08-Apr-2018 00:02:11 2018-04-08 00:02:11,229 - INFO  [ExploreExecutorService STARTING:c.c.c.e.e.ExploreExecutorService@88] - Starting ExploreExecutorService...
08-Apr-2018 00:02:11 2018-04-08 00:02:11,239 - INFO  [ExploreExecutorService STARTING:c.c.c.e.s.h.BaseHiveExploreService@1081] - Checking for tables that need upgrade...
08-Apr-2018 00:02:11 2018-04-08 00:02:11,240 - INFO  [Hive14ExploreService STARTING:c.c.c.e.s.h.BaseHiveExploreService@296] - Starting BaseHiveExploreService...
08-Apr-2018 00:02:11 2018-04-08 00:02:11,691 - INFO  [netty-executor-7:c.c.c.d.d.d.s.DatasetInstanceService@208] - Creating dataset system.metrics.v2.table.ts.1, type name: co.cask.cdap.data2.dataset2.lib.table.MetricsTable, properties: {hbase.splits=[[0,0,0,2],[0,0,0,3],[0,0,0,4],[0,0,0,5],[0,0,0,6],[0,0,0,7],[0,0,0,8],[0,0,0,9],[0,0,0,10],[0,0,0,11],[0,0,0,12]], dataset.table.ttl=7200, dataset.table.readless.increment=true}
08-Apr-2018 00:02:11 2018-04-08 00:02:11,720 - INFO  [netty-executor-7:c.c.c.d.d.d.s.e.DatasetAdminService@86] - Creating dataset instance dataset:system.metrics.v2.table.ts.1, type meta: DatasetTypeMeta{name='co.cask.cdap.data2.dataset2.lib.table.MetricsTable', modules=[DatasetModuleMeta{name='metricsTable-memory', className='co.cask.cdap.data2.dataset2.module.lib.inmemory.InMemoryMetricsTableModule', jarLocation=null, types=[co.cask.cdap.data2.dataset2.lib.table.inmemory.InMemoryMetricsTable, co.cask.cdap.data2.dataset2.lib.table.MetricsTable], usesModules=[], usedByModules=[cube]}]}, props: DatasetProperties{properties={hbase.splits=[[0,0,0,2],[0,0,0,3],[0,0,0,4],[0,0,0,5],[0,0,0,6],[0,0,0,7],[0,0,0,8],[0,0,0,9],[0,0,0,10],[0,0,0,11],[0,0,0,12]], dataset.table.ttl=7200, dataset.table.readless.increment=true}}
08-Apr-2018 00:02:11 2018-04-08 00:02:11,739 - INFO  [metrics-cleanup:c.c.c.m.s.DefaultMetricDatasetFactory@99] - FactTable created: metrics.v2.table.ts.1
08-Apr-2018 00:02:11 2018-04-08 00:02:11,766 - INFO  [netty-executor-2:c.c.c.d.d.d.s.DatasetInstanceService@208] - Creating dataset system.metrics.v2.entity, type name: co.cask.cdap.data2.dataset2.lib.table.MetricsTable, properties: {}
08-Apr-2018 00:02:11 2018-04-08 00:02:11,773 - INFO  [netty-executor-3:c.c.c.d.d.d.s.e.DatasetAdminService@86] - Creating dataset instance dataset:system.metrics.v2.entity, type meta: DatasetTypeMeta{name='co.cask.cdap.data2.dataset2.lib.table.MetricsTable', modules=[DatasetModuleMeta{name='metricsTable-memory', className='co.cask.cdap.data2.dataset2.module.lib.inmemory.InMemoryMetricsTableModule', jarLocation=null, types=[co.cask.cdap.data2.dataset2.lib.table.inmemory.InMemoryMetricsTable, co.cask.cdap.data2.dataset2.lib.table.MetricsTable], usesModules=[], usedByModules=[cube]}]}, props: DatasetProperties{properties={}}
08-Apr-2018 00:02:11 2018-04-08 00:02:11,823 - INFO  [netty-executor-5:c.c.c.d.d.d.s.DatasetInstanceService@208] - Creating dataset system.metrics.v2.table.ts.60, type name: co.cask.cdap.data2.dataset2.lib.table.MetricsTable, properties: {hbase.splits=[[0,0,0,2],[0,0,0,3],[0,0,0,4],[0,0,0,5],[0,0,0,6],[0,0,0,7],[0,0,0,8],[0,0,0,9],[0,0,0,10],[0,0,0,11],[0,0,0,12]], dataset.table.ttl=2592000, dataset.table.readless.increment=true}
08-Apr-2018 00:02:11 2018-04-08 00:02:11,829 - INFO  [netty-executor-9:c.c.c.d.d.d.s.e.DatasetAdminService@86] - Creating dataset instance dataset:system.metrics.v2.table.ts.60, type meta: DatasetTypeMeta{name='co.cask.cdap.data2.dataset2.lib.table.MetricsTable', modules=[DatasetModuleMeta{name='metricsTable-memory', className='co.cask.cdap.data2.dataset2.module.lib.inmemory.InMemoryMetricsTableModule', jarLocation=null, types=[co.cask.cdap.data2.dataset2.lib.table.inmemory.InMemoryMetricsTable, co.cask.cdap.data2.dataset2.lib.table.MetricsTable], usesModules=[], usedByModules=[cube]}]}, props: DatasetProperties{properties={hbase.splits=[[0,0,0,2],[0,0,0,3],[0,0,0,4],[0,0,0,5],[0,0,0,6],[0,0,0,7],[0,0,0,8],[0,0,0,9],[0,0,0,10],[0,0,0,11],[0,0,0,12]], dataset.table.ttl=2592000, dataset.table.readless.increment=true}}
08-Apr-2018 00:02:11 2018-04-08 00:02:11,852 - INFO  [metrics-cleanup:c.c.c.m.s.DefaultMetricDatasetFactory@99] - FactTable created: metrics.v2.table.ts.60
08-Apr-2018 00:02:11 2018-04-08 00:02:11,886 - INFO  [netty-executor-7:c.c.c.d.d.d.s.DatasetInstanceService@208] - Creating dataset system.metrics.v2.table.ts.3600, type name: co.cask.cdap.data2.dataset2.lib.table.MetricsTable, properties: {hbase.splits=[[0,0,0,2],[0,0,0,3],[0,0,0,4],[0,0,0,5],[0,0,0,6],[0,0,0,7],[0,0,0,8],[0,0,0,9],[0,0,0,10],[0,0,0,11],[0,0,0,12]], dataset.table.ttl=2592000, dataset.table.readless.increment=true}
08-Apr-2018 00:02:11 2018-04-08 00:02:11,891 - INFO  [netty-executor-5:c.c.c.d.d.d.s.e.DatasetAdminService@86] - Creating dataset instance dataset:system.metrics.v2.table.ts.3600, type meta: DatasetTypeMeta{name='co.cask.cdap.data2.dataset2.lib.table.MetricsTable', modules=[DatasetModuleMeta{name='metricsTable-memory', className='co.cask.cdap.data2.dataset2.module.lib.inmemory.InMemoryMetricsTableModule', jarLocation=null, types=[co.cask.cdap.data2.dataset2.lib.table.inmemory.InMemoryMetricsTable, co.cask.cdap.data2.dataset2.lib.table.MetricsTable], usesModules=[], usedByModules=[cube]}]}, props: DatasetProperties{properties={hbase.splits=[[0,0,0,2],[0,0,0,3],[0,0,0,4],[0,0,0,5],[0,0,0,6],[0,0,0,7],[0,0,0,8],[0,0,0,9],[0,0,0,10],[0,0,0,11],[0,0,0,12]], dataset.table.ttl=2592000, dataset.table.readless.increment=true}}
08-Apr-2018 00:02:11 2018-04-08 00:02:11,910 - INFO  [metrics-cleanup:c.c.c.m.s.DefaultMetricDatasetFactory@99] - FactTable created: metrics.v2.table.ts.3600
08-Apr-2018 00:02:11 2018-04-08 00:02:11,944 - INFO  [netty-executor-1:c.c.c.d.d.d.s.DatasetInstanceService@208] - Creating dataset system.metrics.v2.table.ts.2147483647, type name: co.cask.cdap.data2.dataset2.lib.table.MetricsTable, properties: {hbase.splits=[[0,0,0,2],[0,0,0,3],[0,0,0,4],[0,0,0,5],[0,0,0,6],[0,0,0,7],[0,0,0,8],[0,0,0,9],[0,0,0,10],[0,0,0,11],[0,0,0,12]], dataset.table.readless.increment=true}
08-Apr-2018 00:02:11 2018-04-08 00:02:11,950 - INFO  [netty-executor-2:c.c.c.d.d.d.s.e.DatasetAdminService@86] - Creating dataset instance dataset:system.metrics.v2.table.ts.2147483647, type meta: DatasetTypeMeta{name='co.cask.cdap.data2.dataset2.lib.table.MetricsTable', modules=[DatasetModuleMeta{name='metricsTable-memory', className='co.cask.cdap.data2.dataset2.module.lib.inmemory.InMemoryMetricsTableModule', jarLocation=null, types=[co.cask.cdap.data2.dataset2.lib.table.inmemory.InMemoryMetricsTable, co.cask.cdap.data2.dataset2.lib.table.MetricsTable], usesModules=[], usedByModules=[cube]}]}, props: DatasetProperties{properties={hbase.splits=[[0,0,0,2],[0,0,0,3],[0,0,0,4],[0,0,0,5],[0,0,0,6],[0,0,0,7],[0,0,0,8],[0,0,0,9],[0,0,0,10],[0,0,0,11],[0,0,0,12]], dataset.table.readless.increment=true}}
08-Apr-2018 00:02:11 2018-04-08 00:02:11,970 - INFO  [metrics-cleanup:c.c.c.m.s.DefaultMetricDatasetFactory@99] - FactTable created: metrics.v2.table.ts.2147483647
08-Apr-2018 00:02:12 2018-04-08 00:02:12,235 - WARN  [Hive14ExploreService STARTING:o.a.h.u.NativeCodeLoader@62] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
08-Apr-2018 00:02:12 2018-04-08 00:02:12,501 - INFO  [Hive14ExploreService STARTING:o.a.h.h.m.HiveMetaStore$HMSHandler@575] - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
08-Apr-2018 00:02:12 2018-04-08 00:02:12,543 - INFO  [Hive14ExploreService STARTING:o.a.h.h.m.ObjectStore@269] - ObjectStore, initialize called
08-Apr-2018 00:02:12 log4j:WARN No appenders could be found for logger (DataNucleus.General).
08-Apr-2018 00:02:12 log4j:WARN Please initialize the log4j system properly.
08-Apr-2018 00:02:12 log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
08-Apr-2018 00:02:15 2018-04-08 00:02:15,510 - INFO  [Hive14ExploreService STARTING:o.a.h.h.m.ObjectStore@350] - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
08-Apr-2018 00:02:19 2018-04-08 00:02:19,456 - INFO  [Hive14ExploreService STARTING:o.a.h.h.m.MetaStoreDirectSql@132] - Using direct SQL, underlying DB is DERBY
08-Apr-2018 00:02:19 2018-04-08 00:02:19,457 - INFO  [Hive14ExploreService STARTING:o.a.h.h.m.ObjectStore@252] - Initialized ObjectStore
08-Apr-2018 00:02:19 2018-04-08 00:02:19,620 - WARN  [Hive14ExploreService STARTING:o.a.h.h.m.ObjectStore@6599] - Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.1.0
08-Apr-2018 00:02:19 2018-04-08 00:02:19,819 - WARN  [Hive14ExploreService STARTING:o.a.h.h.m.ObjectStore@548] - Failed to get database default, returning NoSuchObjectException
08-Apr-2018 00:02:20 2018-04-08 00:02:20,193 - INFO  [Hive14ExploreService STARTING:o.a.h.h.m.HiveMetaStore$HMSHandler@649] - Added admin role in metastore
08-Apr-2018 00:02:20 2018-04-08 00:02:20,198 - INFO  [Hive14ExploreService STARTING:o.a.h.h.m.HiveMetaStore$HMSHandler@658] - Added public role in metastore
08-Apr-2018 00:02:20 2018-04-08 00:02:20,310 - INFO  [Hive14ExploreService STARTING:o.a.h.h.m.HiveMetaStore$HMSHandler@698] - No user is added in admin role, since config is empty
08-Apr-2018 00:02:20 2018-04-08 00:02:20,492 - INFO  [Hive14ExploreService STARTING:o.a.h.c.Configuration@996] - dfs.umaskmode is deprecated. Instead, use fs.permissions.umask-mode
08-Apr-2018 00:02:20 2018-04-08 00:02:20,516 - INFO  [Hive14ExploreService STARTING:o.a.h.h.q.s.SessionState@586] - Created HDFS directory: /tmp/CG-DCCP5-JOB1/junit6402254617002480524/hive/tmp/bamboo
08-Apr-2018 00:02:20 2018-04-08 00:02:20,535 - INFO  [Hive14ExploreService STARTING:o.a.h.h.q.s.SessionState@586] - Created local directory: /tmp/CG-DCCP5-JOB1/0674044c-805d-44c2-8426-d37f96dbfaf1_resources
08-Apr-2018 00:02:20 2018-04-08 00:02:20,541 - INFO  [Hive14ExploreService STARTING:o.a.h.h.q.s.SessionState@586] - Created HDFS directory: /tmp/CG-DCCP5-JOB1/junit6402254617002480524/hive/tmp/bamboo/0674044c-805d-44c2-8426-d37f96dbfaf1
08-Apr-2018 00:02:20 2018-04-08 00:02:20,565 - INFO  [Hive14ExploreService STARTING:o.a.h.h.q.s.SessionState@586] - Created local directory: /tmp/CG-DCCP5-JOB1/bamboo/0674044c-805d-44c2-8426-d37f96dbfaf1
08-Apr-2018 00:02:20 2018-04-08 00:02:20,571 - INFO  [Hive14ExploreService STARTING:o.a.h.h.q.s.SessionState@586] - Created HDFS directory: /tmp/CG-DCCP5-JOB1/junit6402254617002480524/hive/tmp/bamboo/0674044c-805d-44c2-8426-d37f96dbfaf1/_tmp_space.db
08-Apr-2018 00:02:20 2018-04-08 00:02:20,572 - INFO  [Hive14ExploreService STARTING:o.a.h.h.q.s.SessionState@488] - No Tez session required at this point. hive.execution.engine=mr.
08-Apr-2018 00:02:20 2018-04-08 00:02:20,872 - INFO  [Hive14ExploreService STARTING:o.a.h.h.m.ObjectStore@269] - ObjectStore, initialize called
08-Apr-2018 00:02:20 2018-04-08 00:02:20,887 - INFO  [Hive14ExploreService STARTING:o.a.h.h.m.MetaStoreDirectSql@132] - Using direct SQL, underlying DB is DERBY
08-Apr-2018 00:02:20 2018-04-08 00:02:20,887 - INFO  [Hive14ExploreService STARTING:o.a.h.h.m.ObjectStore@252] - Initialized ObjectStore
08-Apr-2018 00:02:20 2018-04-08 00:02:20,888 - INFO  [Hive14ExploreService STARTING:o.a.h.h.m.HiveMetaStore$HMSHandler@732] - 0: get_databases: default
08-Apr-2018 00:02:20 2018-04-08 00:02:20,889 - INFO  [Hive14ExploreService STARTING:o.a.h.h.m.HiveMetaStore$HMSHandler@358] - ugi=bamboo        ip=unknown-ip-addr        cmd=get_databases: default        
08-Apr-2018 00:02:20 2018-04-08 00:02:20,910 - INFO  [Hive14ExploreService STARTING:o.a.h.h.m.HiveMetaStore$HMSHandler@732] - 0: Shutting down the object store...
08-Apr-2018 00:02:20 2018-04-08 00:02:20,911 - INFO  [Hive14ExploreService STARTING:o.a.h.h.m.HiveMetaStore$HMSHandler@358] - ugi=bamboo        ip=unknown-ip-addr        cmd=Shutting down the object store...        
08-Apr-2018 00:02:20 2018-04-08 00:02:20,911 - INFO  [Hive14ExploreService STARTING:o.a.h.h.m.HiveMetaStore$HMSHandler@732] - 0: Metastore shutdown complete.
08-Apr-2018 00:02:20 2018-04-08 00:02:20,912 - INFO  [Hive14ExploreService STARTING:o.a.h.h.m.HiveMetaStore$HMSHandler@358] - ugi=bamboo        ip=unknown-ip-addr        cmd=Metastore shutdown complete.        
08-Apr-2018 00:02:20 2018-04-08 00:02:20,980 - INFO  [ExploreExecutorService STARTING:o.a.h.h.m.HiveMetaStore$HMSHandler@732] - 1: get_all_tables: db=default
08-Apr-2018 00:02:20 2018-04-08 00:02:20,981 - INFO  [ExploreExecutorService STARTING:o.a.h.h.m.HiveMetaStore$HMSHandler@358] - ugi=bamboo        ip=unknown-ip-addr        cmd=get_all_tables: db=default        
08-Apr-2018 00:02:20 2018-04-08 00:02:20,982 - INFO  [ExploreExecutorService STARTING:o.a.h.h.m.HiveMetaStore$HMSHandler@575] - 1: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
08-Apr-2018 00:02:20 2018-04-08 00:02:20,985 - INFO  [ExploreExecutorService STARTING:o.a.h.h.m.ObjectStore@269] - ObjectStore, initialize called
08-Apr-2018 00:02:21 2018-04-08 00:02:21,001 - INFO  [ExploreExecutorService STARTING:o.a.h.h.m.MetaStoreDirectSql@132] - Using direct SQL, underlying DB is DERBY
08-Apr-2018 00:02:21 2018-04-08 00:02:21,002 - INFO  [ExploreExecutorService STARTING:o.a.h.h.m.ObjectStore@252] - Initialized ObjectStore
08-Apr-2018 00:02:21 2018-04-08 00:02:21,044 - INFO  [ExploreExecutorService STARTING:c.c.c.e.e.ExploreExecutorService@109] - ExploreExecutorService started successfully on /127.0.0.1:45153
08-Apr-2018 00:02:21 2018-04-08 00:02:21,062 - WARN  [main:c.c.c.i.a.r.a.ArtifactRepository@104] - Ignoring /opt/cdap/master/artifacts because it is not a directory.
08-Apr-2018 00:02:21 2018-04-08 00:02:21,063 - WARN  [main:c.c.c.i.a.r.a.ArtifactRepository@104] - Ignoring /opt/cdap/master/artifacts because it is not a directory.
08-Apr-2018 00:02:21 2018-04-08 00:02:21,064 - WARN  [main:c.c.c.i.a.r.a.ArtifactRepository@104] - Ignoring /opt/cdap/master/artifacts because it is not a directory.
08-Apr-2018 00:02:21 2018-04-08 00:02:21,189 - WARN  [main:c.c.c.i.a.r.a.ArtifactRepository@104] - Ignoring /opt/cdap/master/artifacts because it is not a directory.
08-Apr-2018 00:02:22 2018-04-08 00:02:22,433 - INFO  [netty-executor-1:c.c.c.d.d.d.s.DatasetInstanceService@208] - Creating dataset system.app.meta, type name: table, properties: {}
08-Apr-2018 00:02:22 2018-04-08 00:02:22,436 - INFO  [netty-executor-8:c.c.c.d.d.d.s.e.DatasetAdminService@86] - Creating dataset instance dataset:system.app.meta, type meta: DatasetTypeMeta{name='table', modules=[DatasetModuleMeta{name='orderedTable-memory', className='co.cask.cdap.data2.dataset2.module.lib.inmemory.InMemoryTableModule', jarLocation=null, types=[table, co.cask.cdap.api.dataset.table.Table], usesModules=[], usedByModules=[core, objectMappedTable, cube, usage, lineage]}]}, props: DatasetProperties{properties={}}
08-Apr-2018 00:02:22 2018-04-08 00:02:22,844 - INFO  [netty-executor-7:c.c.c.d.d.d.s.DatasetInstanceService@208] - Creating dataset system.system.metadata, type name: co.cask.cdap.data2.metadata.dataset.MetadataDataset, properties: {}
08-Apr-2018 00:02:22 2018-04-08 00:02:22,853 - INFO  [netty-executor-5:c.c.c.d.d.d.s.e.DatasetAdminService@86] - Creating dataset instance dataset:system.system.metadata, type meta: DatasetTypeMeta{name='co.cask.cdap.data2.metadata.dataset.MetadataDataset', modules=[DatasetModuleMeta{name='orderedTable-memory', className='co.cask.cdap.data2.dataset2.module.lib.inmemory.InMemoryTableModule', jarLocation=null, types=[table, co.cask.cdap.api.dataset.table.Table], usesModules=[], usedByModules=[core, objectMappedTable, cube, usage, lineage]}, DatasetModuleMeta{name='core', className='co.cask.cdap.data2.dataset2.lib.table.CoreDatasetsModule', jarLocation=null, types=[keyValueTable, co.cask.cdap.api.dataset.lib.KeyValueTable, objectStore, co.cask.cdap.api.dataset.lib.ObjectStore, indexedObjectStore, co.cask.cdap.api.dataset.lib.IndexedObjectStore, indexedTable, co.cask.cdap.api.dataset.lib.IndexedTable, timeseriesTable, co.cask.cdap.api.dataset.lib.TimeseriesTable, counterTimeseriesTable, co.cask.cdap.api.dataset.lib.CounterTimeseriesTable, co.cask.cdap.api.dataset.table.MemoryTable, memoryTable], usesModules=[orderedTable-memory], usedByModules=[timePartitionedFileSet, partitionedFileSet, metadata]}, DatasetModuleMeta{name='metadata', className='co.cask.cdap.data2.metadata.dataset.MetadataDatasetModule', jarLocation=null, types=[metadataDataset, co.cask.cdap.data2.metadata.dataset.MetadataDataset], usesModules=[orderedTable-memory, core], usedByModules=[]}]}, props: DatasetProperties{properties={}}
08-Apr-2018 00:02:23 2018-04-08 00:02:23,002 - INFO  [netty-executor-7:c.c.c.d.d.d.s.DatasetInstanceService@208] - Creating dataset system.artifact.meta, type name: co.cask.cdap.api.dataset.table.Table, properties: {conflict.level=COLUMN}
08-Apr-2018 00:02:23 2018-04-08 00:02:23,008 - INFO  [netty-executor-2:c.c.c.d.d.d.s.e.DatasetAdminService@86] - Creating dataset instance dataset:system.artifact.meta, type meta: DatasetTypeMeta{name='co.cask.cdap.api.dataset.table.Table', modules=[DatasetModuleMeta{name='orderedTable-memory', className='co.cask.cdap.data2.dataset2.module.lib.inmemory.InMemoryTableModule', jarLocation=null, types=[table, co.cask.cdap.api.dataset.table.Table], usesModules=[], usedByModules=[core, objectMappedTable, cube, usage, lineage]}]}, props: DatasetProperties{properties={conflict.level=COLUMN}}
08-Apr-2018 00:02:23 2018-04-08 00:02:23,042 - WARN  [main:c.c.c.i.a.r.a.ArtifactRepository@104] - Ignoring /opt/cdap/master/artifacts because it is not a directory.
08-Apr-2018 00:02:23 2018-04-08 00:02:23,086 - WARN  [main:c.c.c.i.a.r.a.ArtifactRepository@104] - Ignoring /opt/cdap/master/artifacts because it is not a directory.
08-Apr-2018 00:02:23 2018-04-08 00:02:23,398 - INFO  [main:c.c.c.i.a.d.p.DatasetInstanceCreator@60] - Adding instance: weblogsCube
08-Apr-2018 00:02:23 2018-04-08 00:02:23,409 - INFO  [netty-executor-8:c.c.c.d.d.d.s.DatasetInstanceService@208] - Creating dataset default.weblogsCube, type name: co.cask.cdap.api.dataset.lib.cube.Cube, properties: {dataset.cube.resolutions=1,60,3600, dataset.cube.aggregation.agg1.dimensions=response_status, dataset.cube.aggregation.agg2.dimensions=ip,browser}
08-Apr-2018 00:02:23 2018-04-08 00:02:23,413 - INFO  [netty-executor-0:c.c.c.d.d.d.s.e.DatasetAdminService@86] - Creating dataset instance dataset:default.weblogsCube, type meta: DatasetTypeMeta{name='co.cask.cdap.api.dataset.lib.cube.Cube', modules=[DatasetModuleMeta{name='orderedTable-memory', className='co.cask.cdap.data2.dataset2.module.lib.inmemory.InMemoryTableModule', jarLocation=null, types=[table, co.cask.cdap.api.dataset.table.Table], usesModules=[], usedByModules=[core, objectMappedTable, cube, usage, lineage]}, DatasetModuleMeta{name='metricsTable-memory', className='co.cask.cdap.data2.dataset2.module.lib.inmemory.InMemoryMetricsTableModule', jarLocation=null, types=[co.cask.cdap.data2.dataset2.lib.table.inmemory.InMemoryMetricsTable, co.cask.cdap.data2.dataset2.lib.table.MetricsTable], usesModules=[], usedByModules=[cube]}, DatasetModuleMeta{name='cube', className='co.cask.cdap.data2.dataset2.lib.table.CubeModule', jarLocation=null, types=[co.cask.cdap.api.dataset.lib.cube.Cube, cube], usesModules=[orderedTable-memory, metricsTable-memory], usedByModules=[]}]}, props: DatasetProperties{properties={dataset.cube.resolutions=1,60,3600, dataset.cube.aggregation.agg1.dimensions=response_status, dataset.cube.aggregation.agg2.dimensions=ip,browser}}
08-Apr-2018 00:02:23 2018-04-08 00:02:23,642 - WARN  [netty-executor-1:c.c.c.c.c.Configuration@570] - security.auth.server.address is deprecated. Instead, use security.auth.server.bind.address
08-Apr-2018 00:02:24 2018-04-08 00:02:24,394 - INFO  [netty-executor-1:o.a.h.h.q.s.SessionState@586] - Created local directory: /tmp/CG-DCCP5-JOB1/44aa83f1-e96b-4f74-b2ec-d44fae7f644d_resources
08-Apr-2018 00:02:24 2018-04-08 00:02:24,399 - INFO  [netty-executor-1:o.a.h.h.q.s.SessionState@586] - Created HDFS directory: /tmp/CG-DCCP5-JOB1/junit6402254617002480524/hive/tmp/bamboo/44aa83f1-e96b-4f74-b2ec-d44fae7f644d
08-Apr-2018 00:02:24 2018-04-08 00:02:24,405 - INFO  [netty-executor-1:o.a.h.h.q.s.SessionState@586] - Created local directory: /tmp/CG-DCCP5-JOB1/bamboo/44aa83f1-e96b-4f74-b2ec-d44fae7f644d
08-Apr-2018 00:02:24 2018-04-08 00:02:24,410 - INFO  [netty-executor-1:o.a.h.h.q.s.SessionState@586] - Created HDFS directory: /tmp/CG-DCCP5-JOB1/junit6402254617002480524/hive/tmp/bamboo/44aa83f1-e96b-4f74-b2ec-d44fae7f644d/_tmp_space.db
08-Apr-2018 00:02:24 2018-04-08 00:02:24,411 - INFO  [netty-executor-1:o.a.h.h.q.s.SessionState@488] - No Tez session required at this point. hive.execution.engine=mr.
08-Apr-2018 00:02:24 2018-04-08 00:02:24,446 - INFO  [netty-executor-1:o.a.h.h.q.l.PerfLogger@121] - <PERFLOG method=compile from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:24 2018-04-08 00:02:24,491 - INFO  [netty-executor-1:o.a.h.h.q.l.PerfLogger@121] - <PERFLOG method=parse from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:24 2018-04-08 00:02:24,876 - INFO  [netty-executor-1:o.a.h.h.q.l.PerfLogger@148] - </PERFLOG method=parse start=1523145744491 end=1523145744876 duration=385 from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:24 2018-04-08 00:02:24,881 - INFO  [netty-executor-1:o.a.h.h.q.l.PerfLogger@121] - <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:24 2018-04-08 00:02:24,967 - INFO  [netty-executor-1:o.a.h.h.q.p.SemanticAnalyzer@10146] - Starting Semantic Analysis
08-Apr-2018 00:02:24 2018-04-08 00:02:24,983 - INFO  [netty-executor-1:o.a.h.h.q.p.SemanticAnalyzer@10773] - Creating table default.stream_weblogs position=36
08-Apr-2018 00:02:24 2018-04-08 00:02:24,993 - INFO  [netty-executor-1:o.a.h.h.m.HiveMetaStore$HMSHandler@732] - 2: get_table : db=default tbl=stream_weblogs
08-Apr-2018 00:02:24 2018-04-08 00:02:24,993 - INFO  [netty-executor-1:o.a.h.h.m.HiveMetaStore$HMSHandler@358] - ugi=bamboo        ip=unknown-ip-addr        cmd=get_table : db=default tbl=stream_weblogs        
08-Apr-2018 00:02:24 2018-04-08 00:02:24,994 - INFO  [netty-executor-1:o.a.h.h.m.HiveMetaStore$HMSHandler@575] - 2: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
08-Apr-2018 00:02:24 2018-04-08 00:02:24,996 - INFO  [netty-executor-1:o.a.h.h.m.ObjectStore@269] - ObjectStore, initialize called
08-Apr-2018 00:02:25 2018-04-08 00:02:25,005 - INFO  [netty-executor-1:o.a.h.h.m.MetaStoreDirectSql@132] - Using direct SQL, underlying DB is DERBY
08-Apr-2018 00:02:25 2018-04-08 00:02:25,005 - INFO  [netty-executor-1:o.a.h.h.m.ObjectStore@252] - Initialized ObjectStore
08-Apr-2018 00:02:25 2018-04-08 00:02:25,037 - INFO  [netty-executor-1:o.a.h.h.m.HiveMetaStore$HMSHandler@732] - 2: get_database: default
08-Apr-2018 00:02:25 2018-04-08 00:02:25,038 - INFO  [netty-executor-1:o.a.h.h.m.HiveMetaStore$HMSHandler@358] - ugi=bamboo        ip=unknown-ip-addr        cmd=get_database: default        
08-Apr-2018 00:02:25 2018-04-08 00:02:25,091 - INFO  [netty-executor-1:o.a.h.h.q.Driver@433] - Semantic Analysis Completed
08-Apr-2018 00:02:25 2018-04-08 00:02:25,092 - INFO  [netty-executor-1:o.a.h.h.q.l.PerfLogger@148] - </PERFLOG method=semanticAnalyze start=1523145744881 end=1523145745092 duration=211 from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:25 2018-04-08 00:02:25,103 - INFO  [netty-executor-1:o.a.h.h.q.Driver@239] - Returning Hive schema: Schema(fieldSchemas:null, properties:null)
08-Apr-2018 00:02:25 2018-04-08 00:02:25,104 - INFO  [netty-executor-1:o.a.h.h.q.l.PerfLogger@148] - </PERFLOG method=compile start=1523145744446 end=1523145745104 duration=658 from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:25 2018-04-08 00:02:25,110 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.q.l.PerfLogger@121] - <PERFLOG method=Driver.run from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:25 2018-04-08 00:02:25,111 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.q.l.PerfLogger@121] - <PERFLOG method=TimeToSubmit from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:25 2018-04-08 00:02:25,111 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.q.Driver@159] - Concurrency mode is disabled, not creating a lock manager
08-Apr-2018 00:02:25 2018-04-08 00:02:25,112 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.q.l.PerfLogger@121] - <PERFLOG method=Driver.execute from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:25 2018-04-08 00:02:25,112 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.q.Driver@1317] - Starting command: CREATE EXTERNAL TABLE IF NOT EXISTS stream_weblogs (ts bigint, headers map<string,string>, body string) COMMENT 'CDAP Stream' STORED BY 'co.cask.cdap.hive.stream.StreamStorageHandler' WITH SERDEPROPERTIES ('explore.stream.name'='weblogs', 'explore.stream.namespace'='default', 'explore.format.specification'='{"name":"text","schema":{"type":"record","name":"stringBody","fields":[{"name":"body","type":"string"}]},"settings":{}}') TBLPROPERTIES ('cdap.name'='weblogs', 'cdap.version'='3.4.1-1463051886235')
08-Apr-2018 00:02:25 2018-04-08 00:02:25,123 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.q.l.PerfLogger@148] - </PERFLOG method=TimeToSubmit start=1523145745111 end=1523145745122 duration=11 from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:25 2018-04-08 00:02:25,123 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.q.l.PerfLogger@121] - <PERFLOG method=runTasks from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:25 2018-04-08 00:02:25,123 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.q.l.PerfLogger@121] - <PERFLOG method=task.DDL.Stage-0 from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:25 2018-04-08 00:02:25,131 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.q.Driver@1636] - Starting task [Stage-0:DDL] in serial mode
08-Apr-2018 00:02:25 2018-04-08 00:02:25,174 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.m.HiveMetaStore$HMSHandler@732] - 3: create_table: Table(tableName:stream_weblogs, dbName:default, owner:bamboo, createTime:1523145745, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:ts, type:bigint, comment:null), FieldSchema(name:headers, type:map<string,string>, comment:null), FieldSchema(name:body, type:string, comment:null)], location:null, inputFormat:null, outputFormat:null, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:co.cask.cdap.hive.stream.StreamSerDe, parameters:{explore.stream.name=weblogs, explore.stream.namespace=default, explore.format.specification={"name":"text","schema":{"type":"record","name":"stringBody","fields":[{"name":"body","type":"string"}]},"settings":{}}, serialization.format=1}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{}), storedAsSubDirectories:false), partitionKeys:[], parameters:{cdap.name=weblogs, EXTERNAL=TRUE, cdap.version=3.4.1-1463051886235, comment=CDAP Stream, storage_handler=co.cask.cdap.hive.stream.StreamStorageHandler}, viewOriginalText:null, viewExpandedText:null, tableType:EXTERNAL_TABLE, privileges:PrincipalPrivilegeSet(userPrivileges:{}, groupPrivileges:null, rolePrivileges:null), temporary:false)
08-Apr-2018 00:02:25 2018-04-08 00:02:25,175 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.m.HiveMetaStore$HMSHandler@358] - ugi=bamboo        ip=unknown-ip-addr        cmd=create_table: Table(tableName:stream_weblogs, dbName:default, owner:bamboo, createTime:1523145745, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:ts, type:bigint, comment:null), FieldSchema(name:headers, type:map<string,string>, comment:null), FieldSchema(name:body, type:string, comment:null)], location:null, inputFormat:null, outputFormat:null, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:co.cask.cdap.hive.stream.StreamSerDe, parameters:{explore.stream.name=weblogs, explore.stream.namespace=default, explore.format.specification={"name":"text","schema":{"type":"record","name":"stringBody","fields":[{"name":"body","type":"string"}]},"settings":{}}, serialization.format=1}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{}), storedAsSubDirectories:false), partitionKeys:[], parameters:{cdap.name=weblogs, EXTERNAL=TRUE, cdap.version=3.4.1-1463051886235, comment=CDAP Stream, storage_handler=co.cask.cdap.hive.stream.StreamStorageHandler}, viewOriginalText:null, viewExpandedText:null, tableType:EXTERNAL_TABLE, privileges:PrincipalPrivilegeSet(userPrivileges:{}, groupPrivileges:null, rolePrivileges:null), temporary:false)        
08-Apr-2018 00:02:25 2018-04-08 00:02:25,175 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.m.HiveMetaStore$HMSHandler@575] - 3: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
08-Apr-2018 00:02:25 2018-04-08 00:02:25,181 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.m.ObjectStore@269] - ObjectStore, initialize called
08-Apr-2018 00:02:25 2018-04-08 00:02:25,190 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.m.MetaStoreDirectSql@132] - Using direct SQL, underlying DB is DERBY
08-Apr-2018 00:02:25 2018-04-08 00:02:25,190 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.m.ObjectStore@252] - Initialized ObjectStore
08-Apr-2018 00:02:25 2018-04-08 00:02:25,205 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.c.FileUtils@501] - Creating directory if it doesn't exist: file:/tmp/CG-DCCP5-JOB1/junit6402254617002480524/hive/warehouse/1523145731211/stream_weblogs
08-Apr-2018 00:02:25 2018-04-08 00:02:25,494 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.q.l.PerfLogger@148] - </PERFLOG method=runTasks start=1523145745123 end=1523145745494 duration=371 from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:25 2018-04-08 00:02:25,495 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.q.l.PerfLogger@148] - </PERFLOG method=Driver.execute start=1523145745112 end=1523145745495 duration=383 from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:25 2018-04-08 00:02:25,497 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.q.s.SessionState$LogHelper@852] - OK
08-Apr-2018 00:02:25 2018-04-08 00:02:25,497 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.q.l.PerfLogger@121] - <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:25 2018-04-08 00:02:25,497 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.q.l.PerfLogger@148] - </PERFLOG method=releaseLocks start=1523145745497 end=1523145745497 duration=0 from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:25 2018-04-08 00:02:25,498 - INFO  [HiveServer2-Background-Pool: Thread-145:o.a.h.h.q.l.PerfLogger@148] - </PERFLOG method=Driver.run start=1523145745110 end=1523145745498 duration=388 from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:25 2018-04-08 00:02:25,506 - INFO  [explore-handle-timeout:o.a.h.h.q.l.PerfLogger@121] - <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:25 2018-04-08 00:02:25,507 - INFO  [explore-handle-timeout:o.a.h.h.q.l.PerfLogger@148] - </PERFLOG method=releaseLocks start=1523145745506 end=1523145745507 duration=1 from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:25 2018-04-08 00:02:25,699 - INFO  [netty-executor-7:c.c.c.d.d.d.s.DatasetInstanceService@208] - Creating dataset system.usage.registry, type name: UsageDataset, properties: {}
08-Apr-2018 00:02:25 2018-04-08 00:02:25,702 - INFO  [netty-executor-4:c.c.c.d.d.d.s.e.DatasetAdminService@86] - Creating dataset instance dataset:system.usage.registry, type meta: DatasetTypeMeta{name='UsageDataset', modules=[DatasetModuleMeta{name='orderedTable-memory', className='co.cask.cdap.data2.dataset2.module.lib.inmemory.InMemoryTableModule', jarLocation=null, types=[table, co.cask.cdap.api.dataset.table.Table], usesModules=[], usedByModules=[core, objectMappedTable, cube, usage, lineage]}, DatasetModuleMeta{name='usage', className='co.cask.cdap.data2.registry.UsageDatasetModule', jarLocation=null, types=[UsageDataset], usesModules=[orderedTable-memory], usedByModules=[]}]}, props: DatasetProperties{properties={}}
08-Apr-2018 00:02:25 2018-04-08 00:02:25,939 - INFO  [netty-executor-2:c.c.c.d.d.d.s.DatasetInstanceService@208] - Creating dataset system.config.store.table, type name: table, properties: {}
08-Apr-2018 00:02:25 2018-04-08 00:02:25,942 - INFO  [netty-executor-1:c.c.c.d.d.d.s.e.DatasetAdminService@86] - Creating dataset instance dataset:system.config.store.table, type meta: DatasetTypeMeta{name='table', modules=[DatasetModuleMeta{name='orderedTable-memory', className='co.cask.cdap.data2.dataset2.module.lib.inmemory.InMemoryTableModule', jarLocation=null, types=[table, co.cask.cdap.api.dataset.table.Table], usesModules=[], usedByModules=[core, objectMappedTable, cube, usage, lineage]}]}, props: DatasetProperties{properties={}}
08-Apr-2018 00:02:26 2018-04-08 00:02:26,062 - INFO  [main:c.c.c.i.a.r.f.FlowUtils@171] - Queue config for stream:///default/weblogs : [ConsumerGroupConfig{groupId=2882142620232156818, groupSize=1, dequeueStrategy=FIFO, hashKey=null}]
08-Apr-2018 00:02:26 2018-04-08 00:02:26,063 - INFO  [main:c.c.c.d.t.s.FileStreamAdmin@193] - Configure groups for stream:default.weblogs: {2882142620232156818=1}
08-Apr-2018 00:02:26 2018-04-08 00:02:26,176 - INFO  [main:c.c.c.d.t.s.FileStreamAdmin@225] - Configure groups new states: {2882142620232156818=1} [StreamConsumerState{groupId=2882142620232156818, instanceId=0, states=[]}]
08-Apr-2018 00:02:26 2018-04-08 00:02:26,182 - INFO  [main:c.c.c.c.l.LogCollector@72] - Root directory for log collection is /tmp/CG-DCCP5-JOB1/junit6402254617002480524/junit4182421263746750264/logs
08-Apr-2018 00:02:26 2018-04-08 00:02:26,195 - INFO  [netty-executor-7:c.c.c.d.d.d.s.DatasetInstanceService@208] - Creating dataset system.lineage, type name: co.cask.cdap.data2.metadata.lineage.LineageDataset, properties: {}
08-Apr-2018 00:02:26 2018-04-08 00:02:26,198 - INFO  [netty-executor-0:c.c.c.d.d.d.s.e.DatasetAdminService@86] - Creating dataset instance dataset:system.lineage, type meta: DatasetTypeMeta{name='co.cask.cdap.data2.metadata.lineage.LineageDataset', modules=[DatasetModuleMeta{name='orderedTable-memory', className='co.cask.cdap.data2.dataset2.module.lib.inmemory.InMemoryTableModule', jarLocation=null, types=[table, co.cask.cdap.api.dataset.table.Table], usesModules=[], usedByModules=[core, objectMappedTable, cube, usage, lineage]}, DatasetModuleMeta{name='lineage', className='co.cask.cdap.data2.metadata.lineage.LineageDatasetModule', jarLocation=null, types=[lineageDataset, co.cask.cdap.data2.metadata.lineage.LineageDataset], usesModules=[orderedTable-memory], usedByModules=[]}]}, props: DatasetProperties{properties={}}
08-Apr-2018 00:02:26 2018-04-08 00:02:26,338 - INFO  [main:c.c.c.d.t.s.AbstractStreamFileConsumerFactory@207] - Empty consumer state. Create file reader with file offsets: groupId=2882142620232156818, instanceId=0 states=[StreamFileOffset{event=/tmp/CG-DCCP5-JOB1/junit6402254617002480524/junit4182421263746750264/namespaces/default/streams/weblogs/1523142000.03600/file.0.000000.dat, offset=0}]
08-Apr-2018 00:02:26 2018-04-08 00:02:26,373 - INFO  [main:c.c.c.d.t.s.AbstractStreamFileConsumer@180] - Create consumer ConsumerConfig{groupId=2882142620232156818, instanceId=0, groupSize=1, dequeueStrategy=FIFO, hashKey=null}, reader offsets: [StreamFileOffset{event=/tmp/CG-DCCP5-JOB1/junit6402254617002480524/junit4182421263746750264/namespaces/default/streams/weblogs/1523142000.03600/file.0.000000.dat, offset=0}]
08-Apr-2018 00:02:26 2018-04-08 00:02:26,407 - INFO  [main:c.c.c.i.a.r.f.FlowletProgramRunner@274] - Starting flowlet: flowlet=writer, instance=0, groupsize=1, namespaceId=default, applicationId=WebAnalyticsApp, program=CubeWriterFlow, runid=1ed35ba1-3ac0-11e8-a7fc-000000feccb5
08-Apr-2018 00:02:26 2018-04-08 00:02:26,408 - INFO  [main:c.c.c.i.a.r.f.FlowletProgramRunner@276] - Flowlet started: flowlet=writer, instance=0, groupsize=1, namespaceId=default, applicationId=WebAnalyticsApp, program=CubeWriterFlow, runid=1ed35ba1-3ac0-11e8-a7fc-000000feccb5
08-Apr-2018 00:02:26 2018-04-08 00:02:26,806 - INFO  [FlowletRuntimeService STARTING:c.c.c.i.a.r.f.FlowletRuntimeService$1@112] - Initializing flowlet: flowlet=writer, instance=0, groupsize=1, namespaceId=default, applicationId=WebAnalyticsApp, program=CubeWriterFlow, runid=1ed35ba1-3ac0-11e8-a7fc-000000feccb5
08-Apr-2018 00:02:26 2018-04-08 00:02:26,821 - INFO  [FlowletRuntimeService STARTING:c.c.c.i.a.r.f.FlowletRuntimeService$1@119] - Flowlet initialized: flowlet=writer, instance=0, groupsize=1, namespaceId=default, applicationId=WebAnalyticsApp, program=CubeWriterFlow, runid=1ed35ba1-3ac0-11e8-a7fc-000000feccb5
08-Apr-2018 00:02:26 2018-04-08 00:02:26,850 - INFO  [main:c.c.c.d.s.s.ConcurrentStreamWriter$StreamFileFactory@323] - Create stream writer for stream:default.weblogs with generation 0
08-Apr-2018 00:02:26 2018-04-08 00:02:26,875 - INFO  [main:c.c.c.m.s.DefaultMetricDatasetFactory@99] - FactTable created: metrics.v2.table.ts.1
08-Apr-2018 00:02:26 2018-04-08 00:02:26,890 - INFO  [main:c.c.c.m.s.DefaultMetricDatasetFactory@99] - FactTable created: metrics.v2.table.ts.60
08-Apr-2018 00:02:26 2018-04-08 00:02:26,905 - INFO  [main:c.c.c.m.s.DefaultMetricDatasetFactory@99] - FactTable created: metrics.v2.table.ts.3600
08-Apr-2018 00:02:26 2018-04-08 00:02:26,917 - INFO  [main:c.c.c.m.s.DefaultMetricDatasetFactory@99] - FactTable created: metrics.v2.table.ts.2147483647
08-Apr-2018 00:02:28 2018-04-08 00:02:28,466 - INFO  [ServiceHttpServer STARTING:c.c.c.i.a.s.ServiceHttpServer@239] - Announced HTTP Service for Service program:default.WebAnalyticsApp.service.CubeService at /127.0.0.1:37620
08-Apr-2018 00:02:28 2018-04-08 00:02:28,580 - INFO  [pcontroller-program:default.WebAnalyticsApp.service.CubeService-2042f133-3ac0-11e8-abdf-000000dbb2a7:c.c.c.i.a.AbstractInMemoryProgramRunner$InMemoryProgramController@167] - Stopping Program: CubeService
08-Apr-2018 00:02:28 2018-04-08 00:02:28,601 - INFO  [pcontroller-program:default.WebAnalyticsApp.service.CubeService-2042f133-3ac0-11e8-abdf-000000dbb2a7:c.c.c.i.a.AbstractInMemoryProgramRunner$InMemoryProgramController@180] - Program stopped: CubeService
08-Apr-2018 00:02:28 2018-04-08 00:02:28,613 - INFO  [pcontroller-program:default.WebAnalyticsApp.flow.CubeWriterFlow-1ed35ba1-3ac0-11e8-a7fc-000000feccb5:c.c.c.i.a.r.f.FlowProgramRunner$FlowProgramController@225] - Stopping flow: CubeWriterFlow
08-Apr-2018 00:02:28 2018-04-08 00:02:28,615 - INFO  [pcontroller-program:default.WebAnalyticsApp.flow.CubeWriterFlow-writer-1ed35ba1-3ac0-11e8-a7fc-000000feccb5:c.c.c.i.a.r.f.FlowletProgramController@90] - Stopping flowlet: flowlet=writer, instance=0, groupsize=1, namespaceId=default, applicationId=WebAnalyticsApp, program=CubeWriterFlow, runid=1ed35ba1-3ac0-11e8-a7fc-000000feccb5
08-Apr-2018 00:02:28 2018-04-08 00:02:28,617 - INFO  [FlowletRuntimeService STOPPING:c.c.c.i.a.r.f.FlowletRuntimeService$2@134] - Destroying flowlet: flowlet=writer, instance=0, groupsize=1, namespaceId=default, applicationId=WebAnalyticsApp, program=CubeWriterFlow, runid=1ed35ba1-3ac0-11e8-a7fc-000000feccb5
08-Apr-2018 00:02:28 2018-04-08 00:02:28,618 - INFO  [FlowletRuntimeService STOPPING:c.c.c.i.a.r.f.FlowletRuntimeService$2@141] - Flowlet destroyed: flowlet=writer, instance=0, groupsize=1, namespaceId=default, applicationId=WebAnalyticsApp, program=CubeWriterFlow, runid=1ed35ba1-3ac0-11e8-a7fc-000000feccb5
08-Apr-2018 00:02:28 2018-04-08 00:02:28,620 - INFO  [pcontroller-program:default.WebAnalyticsApp.flow.CubeWriterFlow-writer-1ed35ba1-3ac0-11e8-a7fc-000000feccb5:c.c.c.i.a.r.f.FlowletProgramController@100] - Flowlet stopped: flowlet=writer, instance=0, groupsize=1, namespaceId=default, applicationId=WebAnalyticsApp, program=CubeWriterFlow, runid=1ed35ba1-3ac0-11e8-a7fc-000000feccb5
08-Apr-2018 00:02:28 2018-04-08 00:02:28,621 - INFO  [pcontroller-program:default.WebAnalyticsApp.flow.CubeWriterFlow-1ed35ba1-3ac0-11e8-a7fc-000000feccb5:c.c.c.i.a.r.f.FlowProgramRunner$FlowProgramController@239] - Flow stopped: CubeWriterFlow
08-Apr-2018 00:02:28 2018-04-08 00:02:28,628 - INFO  [main:c.c.c.i.a.n.DefaultNamespaceAdmin@216] - Deleting namespace 'namespace:default'.
08-Apr-2018 00:02:28 2018-04-08 00:02:28,692 - INFO  [main:c.c.c.m.s.DefaultMetricDatasetFactory@99] - FactTable created: metrics.v2.table.ts.1
08-Apr-2018 00:02:28 2018-04-08 00:02:28,715 - INFO  [main:c.c.c.m.s.DefaultMetricDatasetFactory@99] - FactTable created: metrics.v2.table.ts.60
08-Apr-2018 00:02:28 2018-04-08 00:02:28,729 - INFO  [main:c.c.c.m.s.DefaultMetricDatasetFactory@99] - FactTable created: metrics.v2.table.ts.3600
08-Apr-2018 00:02:28 2018-04-08 00:02:28,743 - INFO  [main:c.c.c.m.s.DefaultMetricDatasetFactory@99] - FactTable created: metrics.v2.table.ts.2147483647
08-Apr-2018 00:02:28 2018-04-08 00:02:28,797 - INFO  [main:c.c.c.d.t.s.FileStreamAdmin@193] - Configure groups for stream:default.weblogs: {2882142620232156818=0}
08-Apr-2018 00:02:28 2018-04-08 00:02:28,806 - INFO  [main:c.c.c.d.t.s.FileStreamAdmin@229] - Configure groups remove states: {2882142620232156818=0} [StreamConsumerState{groupId=2882142620232156818, instanceId=0, states=[StreamFileOffset{event=/tmp/CG-DCCP5-JOB1/junit6402254617002480524/junit4182421263746750264/namespaces/default/streams/weblogs/1523145600.03600/file.0.000000.dat, offset=801}]}]
08-Apr-2018 00:02:28 2018-04-08 00:02:28,829 - INFO  [netty-executor-8:c.c.c.d.d.d.s.DatasetInstanceService@208] - Creating dataset system.business.metadata, type name: co.cask.cdap.data2.metadata.dataset.MetadataDataset, properties: {}
08-Apr-2018 00:02:28 2018-04-08 00:02:28,833 - INFO  [netty-executor-4:c.c.c.d.d.d.s.e.DatasetAdminService@86] - Creating dataset instance dataset:system.business.metadata, type meta: DatasetTypeMeta{name='co.cask.cdap.data2.metadata.dataset.MetadataDataset', modules=[DatasetModuleMeta{name='orderedTable-memory', className='co.cask.cdap.data2.dataset2.module.lib.inmemory.InMemoryTableModule', jarLocation=null, types=[table, co.cask.cdap.api.dataset.table.Table], usesModules=[], usedByModules=[core, objectMappedTable, cube, usage, lineage]}, DatasetModuleMeta{name='core', className='co.cask.cdap.data2.dataset2.lib.table.CoreDatasetsModule', jarLocation=null, types=[keyValueTable, co.cask.cdap.api.dataset.lib.KeyValueTable, objectStore, co.cask.cdap.api.dataset.lib.ObjectStore, indexedObjectStore, co.cask.cdap.api.dataset.lib.IndexedObjectStore, indexedTable, co.cask.cdap.api.dataset.lib.IndexedTable, timeseriesTable, co.cask.cdap.api.dataset.lib.TimeseriesTable, counterTimeseriesTable, co.cask.cdap.api.dataset.lib.CounterTimeseriesTable, co.cask.cdap.api.dataset.table.MemoryTable, memoryTable], usesModules=[orderedTable-memory], usedByModules=[timePartitionedFileSet, partitionedFileSet, metadata]}, DatasetModuleMeta{name='metadata', className='co.cask.cdap.data2.metadata.dataset.MetadataDatasetModule', jarLocation=null, types=[metadataDataset, co.cask.cdap.data2.metadata.dataset.MetadataDataset], usesModules=[orderedTable-memory, core], usedByModules=[]}]}, props: DatasetProperties{properties={}}
08-Apr-2018 00:02:28 2018-04-08 00:02:28,984 - INFO  [netty-executor-7:c.c.c.d.d.d.s.DatasetInstanceService@301] - Deleting dataset default.weblogsCube
08-Apr-2018 00:02:29 2018-04-08 00:02:29,062 - INFO  [netty-executor-4:o.a.h.h.m.HiveMetaStore$HMSHandler@732] - 4: get_table : db=default tbl=dataset_weblogscube
08-Apr-2018 00:02:29 2018-04-08 00:02:29,063 - INFO  [netty-executor-4:o.a.h.h.m.HiveMetaStore$HMSHandler@358] - ugi=bamboo        ip=unknown-ip-addr        cmd=get_table : db=default tbl=dataset_weblogscube        
08-Apr-2018 00:02:29 2018-04-08 00:02:29,064 - INFO  [netty-executor-4:o.a.h.h.m.HiveMetaStore$HMSHandler@575] - 4: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
08-Apr-2018 00:02:29 2018-04-08 00:02:29,065 - INFO  [netty-executor-4:o.a.h.h.m.ObjectStore@269] - ObjectStore, initialize called
08-Apr-2018 00:02:29 2018-04-08 00:02:29,079 - INFO  [netty-executor-4:o.a.h.h.m.MetaStoreDirectSql@132] - Using direct SQL, underlying DB is DERBY
08-Apr-2018 00:02:29 2018-04-08 00:02:29,080 - INFO  [netty-executor-4:o.a.h.h.m.ObjectStore@252] - Initialized ObjectStore
08-Apr-2018 00:02:29 2018-04-08 00:02:29,100 - INFO  [netty-executor-1:c.c.c.d.d.d.s.e.DatasetAdminService@155] - Dropping dataset with spec: DatasetSpecification{name='weblogsCube', type='co.cask.cdap.api.dataset.lib.cube.Cube', description='null', originalProperties={dataset.cube.aggregation.agg1.dimensions=response_status, dataset.cube.aggregation.agg2.dimensions=ip,browser, dataset.cube.resolutions=1,60,3600}, properties={dataset.cube.aggregation.agg1.dimensions=response_status, dataset.cube.aggregation.agg2.dimensions=ip,browser, dataset.cube.resolutions=1,60,3600}, datasetSpecs={1=DatasetSpecification{name='weblogsCube.1', type='table', description='null', originalProperties=null, properties={dataset.cube.aggregation.agg1.dimensions=response_status, dataset.cube.aggregation.agg2.dimensions=ip,browser, dataset.cube.resolutions=1,60,3600, hbase.splits=[[0,0,0,2]]}, datasetSpecs={}}, 3600=DatasetSpecification{name='weblogsCube.3600', type='table', description='null', originalProperties=null, properties={dataset.cube.aggregation.agg1.dimensions=response_status, dataset.cube.aggregation.agg2.dimensions=ip,browser, dataset.cube.resolutions=1,60,3600, hbase.splits=[[0,0,0,2]]}, datasetSpecs={}}, 60=DatasetSpecification{name='weblogsCube.60', type='table', description='null', originalProperties=null, properties={dataset.cube.aggregation.agg1.dimensions=response_status, dataset.cube.aggregation.agg2.dimensions=ip,browser, dataset.cube.resolutions=1,60,3600, hbase.splits=[[0,0,0,2]]}, datasetSpecs={}}, entity=DatasetSpecification{name='weblogsCube.entity', type='co.cask.cdap.data2.dataset2.lib.table.MetricsTable', description='null', originalProperties=null, properties={dataset.cube.aggregation.agg1.dimensions=response_status, dataset.cube.aggregation.agg2.dimensions=ip,browser, dataset.cube.resolutions=1,60,3600}, datasetSpecs={}}}}, type meta: DatasetTypeMeta{name='co.cask.cdap.api.dataset.lib.cube.Cube', modules=[DatasetModuleMeta{name='orderedTable-memory', className='co.cask.cdap.data2.dataset2.module.lib.inmemory.InMemoryTableModule', jarLocation=null, types=[table, co.cask.cdap.api.dataset.table.Table], usesModules=[], usedByModules=[core, objectMappedTable, cube, usage, lineage]}, DatasetModuleMeta{name='metricsTable-memory', className='co.cask.cdap.data2.dataset2.module.lib.inmemory.InMemoryMetricsTableModule', jarLocation=null, types=[co.cask.cdap.data2.dataset2.lib.table.inmemory.InMemoryMetricsTable, co.cask.cdap.data2.dataset2.lib.table.MetricsTable], usesModules=[], usedByModules=[cube]}, DatasetModuleMeta{name='cube', className='co.cask.cdap.data2.dataset2.lib.table.CubeModule', jarLocation=null, types=[co.cask.cdap.api.dataset.lib.cube.Cube, cube], usesModules=[orderedTable-memory, metricsTable-memory], usedByModules=[]}]}
08-Apr-2018 00:02:29 2018-04-08 00:02:29,159 - WARN  [netty-executor-0:c.c.c.d.d.d.t.DatasetTypeManager@360] - Deleting all modules from namespace namespace:default
08-Apr-2018 00:02:29 2018-04-08 00:02:29,282 - INFO  [netty-executor-1:o.a.h.h.q.s.SessionState@586] - Created local directory: /tmp/CG-DCCP5-JOB1/31f574fb-6c5c-487f-8a81-ee6d0091bf94_resources
08-Apr-2018 00:02:29 2018-04-08 00:02:29,287 - INFO  [netty-executor-1:o.a.h.h.q.s.SessionState@586] - Created HDFS directory: /tmp/CG-DCCP5-JOB1/junit6402254617002480524/hive/tmp/bamboo/31f574fb-6c5c-487f-8a81-ee6d0091bf94
08-Apr-2018 00:02:29 2018-04-08 00:02:29,292 - INFO  [netty-executor-1:o.a.h.h.q.s.SessionState@586] - Created local directory: /tmp/CG-DCCP5-JOB1/bamboo/31f574fb-6c5c-487f-8a81-ee6d0091bf94
08-Apr-2018 00:02:29 2018-04-08 00:02:29,299 - INFO  [netty-executor-1:o.a.h.h.q.s.SessionState@586] - Created HDFS directory: /tmp/CG-DCCP5-JOB1/junit6402254617002480524/hive/tmp/bamboo/31f574fb-6c5c-487f-8a81-ee6d0091bf94/_tmp_space.db
08-Apr-2018 00:02:29 2018-04-08 00:02:29,300 - INFO  [netty-executor-1:o.a.h.h.q.s.SessionState@488] - No Tez session required at this point. hive.execution.engine=mr.
08-Apr-2018 00:02:29 2018-04-08 00:02:29,302 - INFO  [netty-executor-1:o.a.h.h.q.l.PerfLogger@121] - <PERFLOG method=compile from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:29 2018-04-08 00:02:29,303 - INFO  [netty-executor-1:o.a.h.h.q.l.PerfLogger@121] - <PERFLOG method=parse from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:29 2018-04-08 00:02:29,305 - INFO  [netty-executor-1:o.a.h.h.q.l.PerfLogger@148] - </PERFLOG method=parse start=1523145749303 end=1523145749305 duration=2 from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:29 2018-04-08 00:02:29,305 - INFO  [netty-executor-1:o.a.h.h.q.l.PerfLogger@121] - <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:29 2018-04-08 00:02:29,308 - INFO  [netty-executor-1:o.a.h.h.m.HiveMetaStore$HMSHandler@732] - 2: get_table : db=default tbl=stream_weblogs
08-Apr-2018 00:02:29 2018-04-08 00:02:29,309 - INFO  [netty-executor-1:o.a.h.h.m.HiveMetaStore$HMSHandler@358] - ugi=bamboo        ip=unknown-ip-addr        cmd=get_table : db=default tbl=stream_weblogs        
08-Apr-2018 00:02:29 2018-04-08 00:02:29,565 - INFO  [netty-executor-1:o.a.h.h.q.Driver@433] - Semantic Analysis Completed
08-Apr-2018 00:02:29 2018-04-08 00:02:29,566 - INFO  [netty-executor-1:o.a.h.h.q.l.PerfLogger@148] - </PERFLOG method=semanticAnalyze start=1523145749305 end=1523145749566 duration=261 from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:29 2018-04-08 00:02:29,567 - INFO  [netty-executor-1:o.a.h.h.q.Driver@239] - Returning Hive schema: Schema(fieldSchemas:null, properties:null)
08-Apr-2018 00:02:29 2018-04-08 00:02:29,567 - INFO  [netty-executor-1:o.a.h.h.q.l.PerfLogger@148] - </PERFLOG method=compile start=1523145749302 end=1523145749567 duration=265 from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:29 2018-04-08 00:02:29,570 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.q.l.PerfLogger@121] - <PERFLOG method=Driver.run from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:29 2018-04-08 00:02:29,571 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.q.l.PerfLogger@121] - <PERFLOG method=TimeToSubmit from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:29 2018-04-08 00:02:29,572 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.q.Driver@159] - Concurrency mode is disabled, not creating a lock manager
08-Apr-2018 00:02:29 2018-04-08 00:02:29,572 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.q.l.PerfLogger@121] - <PERFLOG method=Driver.execute from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:29 2018-04-08 00:02:29,572 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.q.Driver@1317] - Starting command: DROP TABLE IF EXISTS stream_weblogs
08-Apr-2018 00:02:29 2018-04-08 00:02:29,573 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.q.l.PerfLogger@148] - </PERFLOG method=TimeToSubmit start=1523145749571 end=1523145749573 duration=2 from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:29 2018-04-08 00:02:29,574 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.q.l.PerfLogger@121] - <PERFLOG method=runTasks from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:29 2018-04-08 00:02:29,574 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.q.l.PerfLogger@121] - <PERFLOG method=task.DDL.Stage-0 from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:29 2018-04-08 00:02:29,575 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.q.Driver@1636] - Starting task [Stage-0:DDL] in serial mode
08-Apr-2018 00:02:29 2018-04-08 00:02:29,575 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.m.HiveMetaStore$HMSHandler@732] - 5: get_table : db=default tbl=stream_weblogs
08-Apr-2018 00:02:29 2018-04-08 00:02:29,576 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.m.HiveMetaStore$HMSHandler@358] - ugi=bamboo        ip=unknown-ip-addr        cmd=get_table : db=default tbl=stream_weblogs        
08-Apr-2018 00:02:29 2018-04-08 00:02:29,576 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.m.HiveMetaStore$HMSHandler@575] - 5: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
08-Apr-2018 00:02:29 2018-04-08 00:02:29,578 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.m.ObjectStore@269] - ObjectStore, initialize called
08-Apr-2018 00:02:29 2018-04-08 00:02:29,585 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.m.MetaStoreDirectSql@132] - Using direct SQL, underlying DB is DERBY
08-Apr-2018 00:02:29 2018-04-08 00:02:29,586 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.m.ObjectStore@252] - Initialized ObjectStore
08-Apr-2018 00:02:29 2018-04-08 00:02:29,618 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.m.HiveMetaStore$HMSHandler@732] - 5: get_table : db=default tbl=stream_weblogs
08-Apr-2018 00:02:29 2018-04-08 00:02:29,620 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.m.HiveMetaStore$HMSHandler@358] - ugi=bamboo        ip=unknown-ip-addr        cmd=get_table : db=default tbl=stream_weblogs        
08-Apr-2018 00:02:29 2018-04-08 00:02:29,642 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.m.HiveMetaStore$HMSHandler@732] - 5: drop_table : db=default tbl=stream_weblogs
08-Apr-2018 00:02:29 2018-04-08 00:02:29,643 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.m.HiveMetaStore$HMSHandler@358] - ugi=bamboo        ip=unknown-ip-addr        cmd=drop_table : db=default tbl=stream_weblogs        
08-Apr-2018 00:02:30 2018-04-08 00:02:30,902 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.q.l.PerfLogger@148] - </PERFLOG method=runTasks start=1523145749574 end=1523145750902 duration=1328 from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:30 2018-04-08 00:02:30,903 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.q.l.PerfLogger@148] - </PERFLOG method=Driver.execute start=1523145749572 end=1523145750903 duration=1331 from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:30 2018-04-08 00:02:30,904 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.q.s.SessionState$LogHelper@852] - OK
08-Apr-2018 00:02:30 2018-04-08 00:02:30,905 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.q.l.PerfLogger@121] - <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:30 2018-04-08 00:02:30,906 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.q.l.PerfLogger@148] - </PERFLOG method=releaseLocks start=1523145750905 end=1523145750906 duration=1 from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:30 2018-04-08 00:02:30,906 - INFO  [HiveServer2-Background-Pool: Thread-213:o.a.h.h.q.l.PerfLogger@148] - </PERFLOG method=Driver.run start=1523145749570 end=1523145750906 duration=1336 from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:30 2018-04-08 00:02:30,991 - INFO  [explore-handle-timeout:o.a.h.h.q.l.PerfLogger@121] - <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:30 2018-04-08 00:02:30,991 - INFO  [explore-handle-timeout:o.a.h.h.q.l.PerfLogger@148] - </PERFLOG method=releaseLocks start=1523145750991 end=1523145750991 duration=0 from=org.apache.hadoop.hive.ql.Driver>
08-Apr-2018 00:02:31 2018-04-08 00:02:31,074 - INFO  [main:c.c.c.m.s.DefaultMetricDatasetFactory@99] - FactTable created: metrics.v2.table.ts.1
08-Apr-2018 00:02:31 2018-04-08 00:02:31,082 - INFO  [main:c.c.c.m.s.DefaultMetricDatasetFactory@99] - FactTable created: metrics.v2.table.ts.60
08-Apr-2018 00:02:31 2018-04-08 00:02:31,089 - INFO  [main:c.c.c.m.s.DefaultMetricDatasetFactory@99] - FactTable created: metrics.v2.table.ts.3600
08-Apr-2018 00:02:31 2018-04-08 00:02:31,098 - INFO  [main:c.c.c.m.s.DefaultMetricDatasetFactory@99] - FactTable created: metrics.v2.table.ts.2147483647
08-Apr-2018 00:02:31 2018-04-08 00:02:31,139 - INFO  [main:c.c.c.i.a.n.DefaultNamespaceAdmin@253] - All data for namespace 'namespace:default' deleted.
08-Apr-2018 00:02:31 2018-04-08 00:02:31,140 - INFO  [MetricsQueryService STOPPING:c.c.c.m.q.MetricsQueryService@115] - Stopping Metrics Service...
08-Apr-2018 00:02:31 2018-04-08 00:02:31,144 - INFO  [LocalSchedulerService STOPPING:c.c.c.i.a.r.s.AbstractSchedulerService@98] - Stopped stream size scheduler
08-Apr-2018 00:02:31 2018-04-08 00:02:31,146 - INFO  [LocalSchedulerService STOPPING:c.c.c.i.a.r.s.AbstractSchedulerService@106] - Stopped time scheduler
08-Apr-2018 00:02:31 2018-04-08 00:02:31,147 - INFO  [ExploreExecutorService STOPPING:c.c.c.e.e.ExploreExecutorService@115] - Stopping ExploreExecutorService...
08-Apr-2018 00:02:31 2018-04-08 00:02:31,152 - INFO  [Hive14ExploreService STOPPING:c.c.c.e.s.h.BaseHiveExploreService@335] - Stopping BaseHiveExploreService...
08-Apr-2018 00:02:41 2018-04-08 00:02:41,154 - INFO  [Hive14ExploreService STOPPING:o.a.h.h.m.HiveMetaStore$HMSHandler@732] - 6: Shutting down the object store...
08-Apr-2018 00:02:41 2018-04-08 00:02:41,155 - INFO  [Hive14ExploreService STOPPING:o.a.h.h.m.HiveMetaStore$HMSHandler@358] - ugi=bamboo        ip=unknown-ip-addr        cmd=Shutting down the object store...        
08-Apr-2018 00:02:41 2018-04-08 00:02:41,155 - INFO  [Hive14ExploreService STOPPING:o.a.h.h.m.HiveMetaStore$HMSHandler@732] - 6: Metastore shutdown complete.
08-Apr-2018 00:02:41 2018-04-08 00:02:41,155 - INFO  [Hive14ExploreService STOPPING:o.a.h.h.m.HiveMetaStore$HMSHandler@358] - ugi=bamboo        ip=unknown-ip-addr        cmd=Metastore shutdown complete.        
08-Apr-2018 00:02:41 2018-04-08 00:02:41,155 - INFO  [Hive14ExploreService STOPPING:o.a.h.h.m.HiveMetaStore$HMSHandler@732] - 6: Shutting down the object store...
08-Apr-2018 00:02:41 2018-04-08 00:02:41,156 - INFO  [Hive14ExploreService STOPPING:o.a.h.h.m.HiveMetaStore$HMSHandler@358] - ugi=bamboo        ip=unknown-ip-addr        cmd=Shutting down the object store...        
08-Apr-2018 00:02:41 2018-04-08 00:02:41,156 - INFO  [Hive14ExploreService STOPPING:o.a.h.h.m.HiveMetaStore$HMSHandler@732] - 6: Metastore shutdown complete.
08-Apr-2018 00:02:41 2018-04-08 00:02:41,156 - INFO  [Hive14ExploreService STOPPING:o.a.h.h.m.HiveMetaStore$HMSHandler@358] - ugi=bamboo        ip=unknown-ip-addr        cmd=Metastore shutdown complete.        
08-Apr-2018 00:02:41 2018-04-08 00:02:41,397 - INFO  [DatasetService:c.c.c.d.d.d.s.DatasetService@215] - Stopping DatasetService...
08-Apr-2018 00:02:44 2018-04-08 00:02:44,400 - INFO  [NettyHttpService STOPPING:c.c.c.d.d.d.s.DatasetTypeHandler@88] - Stopping DatasetTypeHandler
08-Apr-2018 00:02:44 2018-04-08 00:02:44,403 - INFO  [DatasetOpExecutorService STOPPING:c.c.c.d.d.d.s.e.DatasetOpExecutorService@101] - Stopping DatasetOpExecutorService...
08-Apr-2018 00:02:44 Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 38.103 sec
08-Apr-2018 00:02:44
08-Apr-2018 00:02:44 Results :
08-Apr-2018 00:02:44
08-Apr-2018 00:02:44 Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
08-Apr-2018 00:02:44
08-Apr-2018 00:02:44 [INFO]
08-Apr-2018 00:02:44 [INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ cdap-cube-guide ---
08-Apr-2018 00:02:44 [INFO] Building jar: /var/bamboo/xml-data/build-dir/CG-DCCP5-JOB1/target/cdap-cube-guide-1.0.0.jar
08-Apr-2018 00:02:45 [INFO]
08-Apr-2018 00:02:45 [INFO] --- maven-bundle-plugin:2.3.7:bundle (default) @ cdap-cube-guide ---
08-Apr-2018 00:02:45 [INFO] ------------------------------------------------------------------------
08-Apr-2018 00:02:45 [INFO] BUILD SUCCESS
08-Apr-2018 00:02:45 [INFO] ------------------------------------------------------------------------
08-Apr-2018 00:02:45 [INFO] Total time: 48.141s
08-Apr-2018 00:02:45 [INFO] Finished at: Sun Apr 08 00:02:45 UTC 2018
08-Apr-2018 00:02:46 [INFO] Final Memory: 58M/407M
08-Apr-2018 00:02:46 [INFO] ------------------------------------------------------------------------
08-Apr-2018 00:02:46 Parsing test results under /var/bamboo/xml-data/build-dir/CG-DCCP5-JOB1...
08-Apr-2018 00:02:46 Finished task 'clean package' with result: Success
08-Apr-2018 00:02:46 Running post build plugin 'Docker Container Cleanup'
08-Apr-2018 00:02:46 Running post build plugin 'NCover Results Collector'
08-Apr-2018 00:02:46 Running post build plugin 'Clover Results Collector'
08-Apr-2018 00:02:46 Running post build plugin 'npm Cache Cleanup'
08-Apr-2018 00:02:46 Running post build plugin 'Artifact Copier'
08-Apr-2018 00:02:46 Finalising the build...
08-Apr-2018 00:02:46 Stopping timer.
08-Apr-2018 00:02:46 Build CG-DCCP5-JOB1-89 completed.
08-Apr-2018 00:02:46 Running on server: post build plugin 'NCover Results Collector'
08-Apr-2018 00:02:46 Running on server: post build plugin 'Build Hanging Detection Configuration'
08-Apr-2018 00:02:46 Running on server: post build plugin 'Clover Delta Calculator'
08-Apr-2018 00:02:46 Running on server: post build plugin 'Maven Dependencies Postprocessor'
08-Apr-2018 00:02:46 All post build plugins have finished
08-Apr-2018 00:02:46 Generating build results summary...
08-Apr-2018 00:02:46 Saving build results to disk...
08-Apr-2018 00:02:46 Logging substituted variables...
08-Apr-2018 00:02:46 Indexing build results...
08-Apr-2018 00:02:46 Finished building CG-DCCP5-JOB1-89.