
Below are a few tricks that can help you get started. Feel free to add snippets of your own.

Setting up your .bashrc

On sensezilla.berkeley.edu, the environment variables SENSEZILLA_DIR, SENSEZILLA_MISSION, and PATH are already set correctly by the system profile. If you want to point them at a different directory or mission, or if you want to set things up on another computer, add the following to the end of your .bashrc:

SENSEZILLA_DIR="INSERT DIRECTORY OF THE PYTHON REPO HERE"  
SENSEZILLA_MISSION="server"               <---- Change if you want to load a different .conf 
PATH="$SENSEZILLA_DIR/bin:$PATH"

export SENSEZILLA_DIR
export SENSEZILLA_MISSION
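
If you want to confirm the variables are visible to Python tools afterwards (open a new shell or source your .bashrc first), a quick check such as the following works; it just reads the environment and is not part of the sensezilla code itself:

import os

# Print the sensezilla-related environment variables, if set.
for name in ("SENSEZILLA_DIR", "SENSEZILLA_MISSION"):
    print("%s = %s" % (name, os.environ.get(name, "<not set>")))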

Grabbing an arbitrary CSV file to test with

Sometimes it's nice to have a CSV file handy. Here's how to get one with fetcher.py.

First, take a look at available sources/devices to retrieve from:

$ fetcher.py list
Source name: openbms
        driver     : SMAP
        intr       : 1000
        invr       : 1000
        type       : TIMESERIES
        url        : http://new.openbms.org/backend/

        Devices
                8fbe97ef-37ba-5b5b-ba52-d1f643b34045
                c82314ff-06bb-5b38-9caf-1c8453edf1f0

... Output continues

Now that we have a source name and device ID, we can grab the last day's data for that device:

$ fetcher.py fetch openbms 8fbe97ef-37ba-5b5b-ba52-d1f643b34045 test.csv --from -1d
Fetching from 07/26/2012 15:51:32 to 07/27/2012 15:51:32
URL: http://new.openbms.org/backend/api/data/uuid/8fbe97ef-37ba-5b5b-ba52-d1f643b34045?starttime=1343343092000&endtime=1343429492000&format=csv&tags=&
PROGRESS STEP 1 OF 1 "FETCHING URL" 0.00 MB DONE
PROGRESS STEP 1 OF 1 "FETCHING URL" 0.00 MB DONE
PROGRESS STEP 1 OF 1 "FETCHING URL" 0.00 MB DONE
PROGRESS STEP 1 OF 1 "FETCHING URL" 0.23 MB DONE
PROGRESS STEP 1 OF 1 "FETCHING URL" 0.50 MB DONE
PROGRESS STEP 1 OF 1 "FETCHING URL" 0.77 MB DONE
PROGRESS STEP 1 OF 1 "FETCHING URL" 1.03 MB DONE
PROGRESS STEP 1 OF 1 "FETCHING URL" 1.30 MB DONE
PROGRESS STEP 1 OF 1 "FETCHING URL" 1.56 MB DONE
PROGRESS STEP 1 OF 1 "FETCHING URL" 1.83 MB DONE
PROGRESS STEP 1 OF 1 "FETCHING URL" 2.06 MB DONE
$ ls -lh test.csv
-rw-r--r-- xxx xxx 2.1M 2012-07-27 15:51 test.csv
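
If you want to sanity-check the fetched file from Python, a minimal sketch like this works. It assumes the usual two-column layout (millisecond timestamp, value) with no header row; the exact layout produced by fetcher.py may differ:

import csv

# Quick sanity check of the fetched file.
# Assumption: two columns per row (millisecond timestamp, value), no header.
rows = []
with open("test.csv") as f:
    for row in csv.reader(f):
        if len(row) >= 2:
            rows.append((float(row[0]), float(row[1])))

print("points: %d" % len(rows))
print("time span: %.1f s" % ((rows[-1][0] - rows[0][0]) / 1000.0))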

Running a flow locally on a CSV file

This is a good way to practice running your program using the flow mechanisms instead of specifying everything on the command line yourself.

Here, the csv_bounds.pl program is an easy way to specify that the flow should be run over the whole CSV file (note that we pass a time scale of 1000, since the first column of test.csv is in milliseconds).
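
For reference, judging from the --from/--to values that show up in the output below, csv_bounds.pl reads the first and last timestamps out of the CSV and divides them by the given scale before handing them to the flow. A rough Python equivalent (illustrative only; the real script may differ in details and output format) would be:

import csv
import sys

# Rough equivalent of csv_bounds.pl as used in this example:
# read the first and last timestamps from the CSV, divide by the time scale,
# and print run_flow-style bounds.
def csv_bounds(path, scale=1.0):
    with open(path) as f:
        times = [float(row[0]) for row in csv.reader(f) if row]
    return int(times[0] / scale), int(times[-1] / scale)

if __name__ == "__main__":
    start, end = csv_bounds(sys.argv[1], float(sys.argv[2]))
    print("--from %d --to %d" % (start, end))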

$ run_flow.py run lib_builder --local `csv_bounds.pl test.csv 1000` smapcsv test.csv 
WARNING: couldn't connect to  mod_config
Created file : testing_lib_builder_common-fetcher.O0
Created file : testing_lib_builder_common-minfilter.O0
Created directory : testing_lib_builder_libbuilder.O0
Executing python /mnt/DATA/documents/UCB/Singapore/sensezilla-python/bin/fetcher.py fetch --from 1343343092 --to 1343429490 smapcsv test.csv testing_lib_builder_common-fetcher.O0

Warning: test.csv not found in known devices
Calling /usr/bin/perl -w /mnt/DATA/documents/UCB/Singapore/sensezilla-python/bin/csv_merge.pl 1343343092000 1343429490000 testing_lib_builder_common-fetcher.O0 test.csv
Executing env LD_LIBRARY_PATH=/mnt/DATA/documents/UCB/Singapore/sensezilla-python/../sensezilla-cpp/bin /mnt/DATA/documents/UCB/Singapore/sensezilla-python/../sensezilla-cpp/bin/fast_filter -csvin testing_lib_builder_common-fetcher.O0 -csvout testing_lib_builder_common-minfilter.O0 -outprec 12 -intr 1000 -invr 1000

Decreasing memory limit from -1B to 2GB
Reading CSV File: testing_lib_builder_common-fetcher.O0
Read 86372 points in 1 columns.
PROGRESS STEP 1 OF 2 "Min Filter"  DONE
PROGRESS STEP 2 OF 2 "Spike Filter" 1/15 iterations DONE
PROGRESS STEP 2 OF 2 "Spike Filter" 2/15 iterations DONE
PROGRESS STEP 2 OF 2 "Spike Filter" 3/15 iterations DONE
PROGRESS STEP 2 OF 2 "Spike Filter" 4/15 iterations DONE
PROGRESS STEP 2 OF 2 "Spike Filter" 5/15 iterations DONE
PROGRESS STEP 2 OF 2 "Spike Filter" 6/15 iterations DONE
PROGRESS STEP 2 OF 2 "Spike Filter" 7/15 iterations DONE
PROGRESS STEP 2 OF 2 "Spike Filter" 8/15 iterations DONE
PROGRESS STEP 2 OF 2 "Spike Filter" 9/15 iterations DONE
PROGRESS STEP 2 OF 2 "Spike Filter" 10/15 iterations DONE
PROGRESS STEP 2 OF 2 "Spike Filter" 11/15 iterations DONE
PROGRESS STEP 2 OF 2 "Spike Filter" 12/15 iterations DONE
PROGRESS STEP 2 OF 2 "Spike Filter" 13/15 iterations DONE
PROGRESS STEP 2 OF 2 "Spike Filter" 14/15 iterations DONE
PROGRESS STEP 2 OF 2 "Spike Filter" 15/15 iterations DONE
0x13bb3b0 4626.00 42.10
Writing CSV file: testing_lib_builder_common-minfilter.O0
Wrote 86372 points in 1 columns.
Executing env LD_LIBRARY_PATH=/mnt/DATA/documents/UCB/Singapore/sensezilla-python/../sensezilla-cpp/bin /mnt/DATA/documents/UCB/Singapore/sensezilla-python/../sensezilla-cpp/bin/library_builder -csvin testing_lib_builder_common-minfilter.O0 -outdir testing_lib_builder_libbuilder.O0 -outprec 13

Decreasing memory limit from -1B to 2GB
Reading CSV File: testing_lib_builder_common-minfilter.O0
Read 86372 points in 1 columns.
PROGRESS STEP 1 OF 5 "Detect Transitions"  DONE
Threshold: 1.95 (10.00% of 19.55)
16 Transitions found

PROGRESS STEP 2 OF 5 "Separate Time Chunks"  DONE
PROGRESS STEP 3 OF 5 "Order by Size"  DONE
PROGRESS STEP 4 OF 5 "Categorize by Means" 6.67% DONE
PROGRESS STEP 4 OF 5 "Categorize by Means" 13.33% DONE
PROGRESS STEP 4 OF 5 "Categorize by Means" 20.00% DONE
PROGRESS STEP 4 OF 5 "Categorize by Means" 26.67% DONE
PROGRESS STEP 4 OF 5 "Categorize by Means" 33.33% DONE
PROGRESS STEP 4 OF 5 "Categorize by Means" 40.00% DONE
PROGRESS STEP 4 OF 5 "Categorize by Means" 46.67% DONE
PROGRESS STEP 4 OF 5 "Categorize by Means" 53.33% DONE
PROGRESS STEP 4 OF 5 "Categorize by Means" 60.00% DONE
PROGRESS STEP 4 OF 5 "Categorize by Means" 66.67% DONE
PROGRESS STEP 4 OF 5 "Categorize by Means" 73.33% DONE
PROGRESS STEP 4 OF 5 "Categorize by Means" 80.00% DONE
PROGRESS STEP 4 OF 5 "Categorize by Means" 86.67% DONE
PROGRESS STEP 4 OF 5 "Categorize by Means" 93.33% DONE
Cluster  0 : Mean      44.71
        Chunk   0: t:   4497.00 mean:     41.51
        Chunk   1: t:    114.00 mean:     48.00
        Chunk   2: t:  11228.00 mean:     47.95
        Chunk   3: t:  17294.00 mean:     44.82
        Chunk   4: t:   3162.00 mean:     49.89
        Chunk   5: t:  14986.00 mean:     41.34
        Chunk   6: t:  15587.00 mean:     45.36
        Chunk   7: t:  16138.00 mean:     47.36
        Chunk   8: t:  16789.00 mean:     40.99
        Chunk   9: t:  14488.00 mean:     45.60
        Chunk  10: t:  14005.00 mean:     40.90
        Chunk  11: t:  18691.00 mean:     41.65
        Chunk  12: t:   4352.00 mean:     45.45
        Chunk  13: t:  16687.00 mean:     44.94
        Chunk  14: t:  13933.00 mean:     44.87
PROGRESS STEP 5 OF 5 "Write out chunks" 0.00% DONE
Writing CSV file: testing_lib_builder_libbuilder.O0/0.0.csv
Wrote 6730 points in 1 columns.
Writing CSV file: testing_lib_builder_libbuilder.O0/0.1.csv
Wrote 3049 points in 1 columns.
Writing CSV file: testing_lib_builder_libbuilder.O0/0.2.csv
Wrote 2705 points in 1 columns.
Writing CSV file: testing_lib_builder_libbuilder.O0/0.3.csv
Wrote 1398 points in 1 columns.
Writing CSV file: testing_lib_builder_libbuilder.O0/0.4.csv
Wrote 1190 points in 1 columns.
Writing CSV file: testing_lib_builder_libbuilder.O0/0.5.csv
Wrote 602 points in 1 columns.
Writing CSV file: testing_lib_builder_libbuilder.O0/0.6.csv
Wrote 551 points in 1 columns.
Writing CSV file: testing_lib_builder_libbuilder.O0/0.7.csv
Wrote 549 points in 1 columns.
Writing CSV file: testing_lib_builder_libbuilder.O0/0.8.csv
Wrote 506 points in 1 columns.
Writing CSV file: testing_lib_builder_libbuilder.O0/0.9.csv
Wrote 498 points in 1 columns.
Writing CSV file: testing_lib_builder_libbuilder.O0/0.10.csv
Wrote 482 points in 1 columns.
Writing CSV file: testing_lib_builder_libbuilder.O0/0.11.csv
Wrote 149 points in 1 columns.
Writing CSV file: testing_lib_builder_libbuilder.O0/0.12.csv
Wrote 146 points in 1 columns.
Writing CSV file: testing_lib_builder_libbuilder.O0/0.13.csv
Wrote 103 points in 1 columns.
Writing CSV file: testing_lib_builder_libbuilder.O0/0.14.csv
Wrote 73 points in 1 columns.
Done.
DONE RUNNING FLOW
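
The chunk CSVs written by library_builder can be inspected the same way as test.csv. For example, to check one chunk's mean against the values reported above (again assuming the timestamp/value layout carries through, which may not hold exactly):

import csv

# Inspect one of the chunk files written by library_builder.
# Assumption: same (timestamp, value) layout as the input, no header.
values = []
with open("testing_lib_builder_libbuilder.O0/0.0.csv") as f:
    for row in csv.reader(f):
        if len(row) >= 2:
            values.append(float(row[1]))

print("points: %d" % len(values))
print("mean value: %.2f" % (sum(values) / len(values)))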
