Google Vision API Image detection 101


We’re handling 200,000 customer images so far, and keeping the peace can be tough. User-uploaded content is always questionable, and since we run a closed network I’d like to keep irritated customers to a minimum. The Google Vision API lets you filter out the bad images systematically. I’m just getting started with the Vision API, and it turns out Google is alpha-testing AutoML, a smarter version you can train your own models on! I can’t wait to give it a try and review it later.

GOOGLE – IF YOU CAN HEAR ME… PLEASE HOOK ME UP WITH AUTOML!!! 😛

1.) Create a service account on Google Cloud Platform by heading to
left menu : IAM & Admin : Service accounts

2.) Obtain your key in JSON format from Google. Don’t lose it, or you’ll need to delete it and create a new one.

3.) Save the JSON key as key.json and install the Python samples. Then let’s test the API.

python3 -m pip install google-cloud
git clone https://github.com/GoogleCloudPlatform/python-docs-samples.git
mkdir python-docs-samples/keys
cp key.json python-docs-samples/keys
export GOOGLE_APPLICATION_CREDENTIALS="$PWD/python-docs-samples/keys/key.json"
cd python-docs-samples
python3 detect.py safe-search-uri $url

Safe search:  
adult: VERY_UNLIKELY  
medical: VERY_UNLIKELY  
spoofed: VERY_UNLIKELY  
violence: VERY_UNLIKELY  
racy: VERY_UNLIKELY

More than likely, if you are using an http://, https://, or gs:// hosted file, you will want to use the *-uri option. From what I can tell, the non-uri options are for images stored on the local filesystem.
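
If you’d rather not go through the sample script, you can hit the REST endpoint directly. Here’s a minimal sketch using curl and application-default credentials; $url is assumed to point at a publicly reachable image:

# Call the Vision REST API directly for safe-search detection.
# Assumes `gcloud auth application-default login` has been run (or
# GOOGLE_APPLICATION_CREDENTIALS is set).
curl -s -X POST \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  -H "Content-Type: application/json" \
  https://vision.googleapis.com/v1/images:annotate \
  -d '{
    "requests": [{
      "image": { "source": { "imageUri": "'"$url"'" } },
      "features": [{ "type": "SAFE_SEARCH_DETECTION" }]
    }]
  }'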

Instead of using safe-search-uri, you have more options to choose from to pull different metadata:

faces               Detect faces in an image.

faces-uri           Detect faces in the file located in Google Cloud
                    Storage or the web.

labels              Detect labels in the file.

labels-uri          Detects labels in the file located in Google Cloud
                    Storage or on the Web.

landmarks           Detect landmarks in the file.

landmarks-uri       Detect landmarks in the file located in Google Cloud
                    Storage or on the Web.

text                Detect text in the file.

text-uri            Detect text in the file located in Google Cloud
                    Storage or on the Web.

logos               Detect logos in the file.

logos-uri           Detect logos in the file located in Google Cloud
                    Storage or on the Web.

safe-search         Detect unsafe features in the file.

safe-search-uri     Detects unsafe features in the file located in Google
                    Cloud Storage or on the Web.

properties          Detect image properties in the file.

properties-uri      Detects image properties in the file located in Google
                    Cloud Storage or on the Web.

web                 Detects web annotations given an image.

web-uri             Detects web annotations in the file located in Google
                    Cloud Storage.

web-geo             Detects web annotations given an image, using the
                    geotag metadata in the image to detect web entities.

web-geo-uri         Detects web annotations given an image in the file
                    located in Google Cloud Storage, using the geotag
                    metadata in the image to detect web entities.

crophints           Detects crop hints in an image.

crophints-uri       Detects crop hints in the file located in Google Cloud
                    Storage.

document            Detects document features in an image.

document-uri        Detects document features in the file located in
                    Google Cloud Storage.
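
For example, to pull labels instead of safe-search results (the bucket path below is just a placeholder):

# Same detect.py script, different subcommand; gs:// paths work with the *-uri variants.
python3 detect.py labels-uri gs://my-bucket/some-image.jpg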

So far so good. Feel free to leave a comment with anything you’d like me to add.

Google Cloud Platform gcloud SDK


Google released the gcloud SDK to control every aspect of Google Cloud Platform. There is no way I can cover even 1% of how to use this tool, but I can tell you how I use it to get you started.

The Google Cloud site is a good place to start learning, but this should get you up and running fairly quickly.

Installation

Windows:

Installation is fairly easy and only takes a couple of steps.

Navigate to https://cloud.google.com/sdk/ and run the installer like any other Windows application; there is nothing hidden that will give you grief. Local admin privileges are not needed on any OS once you authenticate with Google.

Linux:

gcloud and gsutil are installed by default on all Compute Engine servers right off the bat. If you need the SDK on an offsite machine, download the Linux tarball from https://cloud.google.com/sdk/, then:

tar -xvf google-cloud-sdk*.tar.gz
cd google-cloud-sdk
./install.sh

The setup is interactive.
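
If you decline the installer’s offer to update your PATH, you can source the shipped includes yourself; these file names are as they ship in the tarball:

# Add gcloud to PATH and enable tab completion for the current shell.
source ./google-cloud-sdk/path.bash.inc
source ./google-cloud-sdk/completion.bash.inc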
Configuration:

Configuration is fairly easy once you install the SDK.
gcloud init launches an interactive Getting Started workflow for gcloud. It replaces gcloud auth login as the recommended command to run after you install the Cloud SDK. gcloud init performs the following setup steps:

  • Authorizes gcloud and other SDK tools to access Google Cloud Platform using your user account credentials, or lets you select from accounts whose credentials are already available. gcloud init uses the same browser-based authorization flow as gcloud auth login.

  • Sets properties in a gcloud configuration, including the current project and the default Google Compute Engine region and zone.

Most users run gcloud init to get started with gcloud. You can use subsequent gcloud init invocations to create new gcloud configurations or to reinitialize existing configurations.

Properties set by gcloud init are local and persistent. They are not affected by remote changes to your project. For instance, the default Compute Engine zone in your configuration remains stable, even if you or another user changes the project-level default zone in the Cloud Platform Console. You can resync your configuration at any time by rerunning gcloud init.

(Available since version 0.9.79. Run gcloud --version to see which version you are running.)

gcloud init
This will launch a browser on your local workstation for you to authenticate, just like you do for Gmail.
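
Once it finishes, it’s worth sanity-checking what gcloud init set up:

# Show the active account and any others gcloud knows about.
gcloud auth list
# Show the properties (project, region, zone) in the active configuration.
gcloud config list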

Getting started

One thing I cannot recommend enough is running the following command once authenticated: it sets a default zone for your commands. us-central1-a is located here in Council Bluffs, IA. If you skip this step and do not specify a zone, gcloud will ask on EVERY command.

gcloud config set compute/zone $zone
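
For example, to default everything to the Iowa zone (and, if you like, the matching region):

# Stop gcloud from prompting for a zone on every command.
gcloud config set compute/zone us-central1-a
gcloud config set compute/region us-central1
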
Let’s try a couple of commands now that you are authenticated:

List instances

Sometimes you just need to know the IP address of a machine. Keep in mind that for machines located in a zone other than us-central1-a you will need to specify which zone the machine is in. If you don’t, don’t worry! It will ask!

gcloud compute instances list
NAME               ZONE           MACHINE_TYPE   PREEMPTIBLE  INTERNAL_IP  EXTERNAL_IP  STATUS
ubuntu-server      us-east1-c     n1-standard-1               10.240.0.18               TERMINATED
buckets-n-backups  us-central1-a  g1-small                    10.240.0.2   x.x.x.x      TERMINATED
grafana            us-central1-a  f1-micro                    10.240.0.25  x.x.x.x      RUNNING
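
The list command also takes filters, which is handy once you have more than a handful of machines; the zone and status values here are just examples:

# Only show running instances in a specific zone.
gcloud compute instances list --filter="zone:us-central1-a AND status:RUNNING"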

SSH to instance, no keys required

[justin@justin-nix ~]$ gcloud compute ssh jackal
Warning: Permanently added 'x.x.x.x' (RSA) to the list of known hosts.
Last login: Wed Jun 29 01:50:09 2016 from x.x.x.x
[justin@jackal ~]$

Copying files to instances:

Copying files from offsite to the cloud

Getting files to the cloud machines can be tricky, but there are a few ways to do it! If you need to access an offsite machine directly, you can add your SSH key to the servers you need.

[justin@justin-nix ~]$ gcloud compute copy-files testfile.txt machine-name:/home/justin

and look… there it is:

[justin@jackal ~]$ pwd
/home/justin
[justin@jackal ~]$ ls -lth

-rw-rw-r--  1 justin justin    0 Jun 29 19:04 testfile.txt
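
One heads-up: newer SDK releases deprecate copy-files in favor of gcloud compute scp, which works the same way for this case:

# Equivalent copy using the newer scp subcommand.
gcloud compute scp testfile.txt machine-name:/home/justin
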
Working with machines across datacenters:

Sometimes you need to place a machine close to your clients for one reason or another, and sometimes it is nice to access a one-off machine using gcloud without modifying your defaults for the app.

It’s easy!

$ gcloud compute ssh machine-name --project (project-name) --zone (zone-name)
Example
gcloud compute ssh ghost-web1 --zone us-central1-a --project jello-shots-1123

will log you in to the machine if your user has project access.

$ gcloud projects list

PROJECT_ID                         NAME                           PROJECT_NUMBER
hallowed-scene-xxxx                My First Project               xxxxxxxxxxxx
hotlinesinc.com:cosmic-answer-xxx  API Project                    xxxxxxxxxxxx
virtual-cluster-xxxx               Cloud Buildout                 xxxxxxxxxxxx
virtual-cluster-xxxx               virtual-cluster                xxxxxxxxxxxx
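
Once you know the project ID, you can set it as your default so you can drop --project from most commands:

# Make a project the default for the active gcloud configuration.
gcloud config set project hallowed-scene-xxxx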

Google Cloud SQL Backup removal


I take backups of Google Cloud SQL using the Google Cloud API, and I never implemented a trim process because we want longer-term backups.

I use the following bash one-liners to clean up our backups when needed. The first command lists all existing backups; here’s what each piece does:

grep -v = inverted grep; keep only the lines without DELETED

tail -n +2 = show every line except the header row

cut -f1 -d ' ' = cut the first field, with fields separated by spaces

> outfile = redirect the output and save it to a file named 'outfile'

gcloud beta sql backups list --instance $instancename | grep -v DELETED | tail -n +2 | cut -f1 -d ' ' > outfile

vim outfile and remove the lines for any backups you want to keep; everything left in the file gets deleted in the next step.

cat outfile | while read line; do /opt/google-cloud-sdk/bin/gcloud beta sql backups delete $line --instance development --quiet --async; done
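
If you prefer, the same loop collapses into xargs with identical behavior:

# One delete per backup ID listed in outfile.
xargs -I{} /opt/google-cloud-sdk/bin/gcloud beta sql backups delete {} --instance development --quiet --async < outfile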

This saves me hours of cleanup time; I’d rather delete them every couple of weeks to a month for my use case.

bash script:

Just enter the instance name and the number of backups to keep. There is a 30-second sleep built in to confirm you want to delete; hit ctrl-c before the 30 seconds are up to cancel. It’s ready to go as a cron script.

#!/bin/bash

# Cron script for trimming Google Cloud SQL backups.
# This can be run daily or weekly; it just keeps the bill down.

# Assign variables
backups="+40" # snapshots to keep, Approx. 30 days for me with auto snaps turned on as well.  
outfile="/tmp/sql-trim-output-file.tmp" # temp file for process use.  
sql_instance="hotlinesng" # this is the master instance name on cloud SQL.  
gcloud_location="/opt/google-cloud-sdk/bin/gcloud" # gcloud executable location.

#
# gcloud needs to be installed and on a path accessible to the user. That user or account must have the appropriate permissions.
# 
# gcloud beta sql backups list lists the backups for $sql_instance | grep skips deleted backups because you can't delete them twice
# the tail command skips the newest backups so only the older ones land in the file | the cut in the loop below takes the first column (f1), fields delimited by spaces.
# the cleansed gcloud output is saved to a file we can ingest in the next step.
#
#$gcloud_location beta sql backups list --instance $sql_instance | grep -v DELETED | tail -n $backups | cut -f1 -d " " > $outfile
$gcloud_location beta sql backups list --instance $sql_instance | grep -v DELETED | grep -v UNKNOWN_STATUS | grep -v OVERDUE | tail -n $backups > $outfile

echo "deleting the following snapshots in 30 seconds:"  
cat $outfile && sleep 30

# cat $outfile into a while loop that walks the file line by line until we're done.
# the quiet option skips the per-backup confirmation prompt, and async doesn't wait for each delete to finish.
cat $outfile | cut -f1 -d " " | while read line; do /opt/google-cloud-sdk/bin/gcloud beta sql backups delete $line --instance $sql_instance --quiet --async; done

# delete the temp file
rm -f $outfile && echo "" && echo "temp file deleted successfully"
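
To run it from cron, something like the following works; the path and schedule are just examples, adjust to taste:

# Trim Cloud SQL backups every Sunday at 3am.
0 3 * * 0 /opt/scripts/sql-backup-trim.sh >> /var/log/sql-backup-trim.log 2>&1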