
Serverless Days Sydney 2019

I attended Serverless Days in Sydney today. Overall it had a good sense of community, and the venue and food were top notch. This is a community-run conference and there's always good representation from the main serverless cloud providers here. *Cough* Google/AWS/Microsoft *Cough*. Alibaba made an appearance too.

Are we at a serverless tipping point?

Jeff Hollan, a Senior Program Manager for Microsoft Azure Functions, doesn't think we are there yet, but we are getting close, and you can help by sharing your learnings and feedback. Here is my blog post on getting started with serverless functions on Google Cloud. You can check out one of Jeff's previous talks on Azure Functions for enterprise here:

Can we apply a serverless mindset?

Ben Kehoe, a Cloud Robotics Research Scientist at @iRobot, gave us lessons learned from applying this mindset to more than just web applications. He goes over how, as engineers, we need to shift our mindset to focus on delivering business value and not get caught up in feeling like the centre of the business. Any code you write is a liability. That's why I advocate for lean test code that adds value.

On a side note, anyone who gets to work in robotics professionally is super cool in my books. My favourite presentation of my own is the one where I present Tappy McTapface, a robot for mobile app testing. You can watch a previous talk by Ben on a similar idea here:

Did you miss out on the action?

Some of the speakers have given their presentations at previous conferences and meetup groups, like Jessica Flanagan at YOW! Data this year and Denis Bauer at YOW! Perth last year.

I previously live streamed the Serverless Sydney meetup group and the Node user group on Twitch (however, these videos are only stored on Twitch for up to 60 days post live stream).

Serverless Icon Showdown

Who has the best logo? There's Google Cloud Functions, AWS Lambda, Microsoft Azure Functions and Alibaba Cloud Function Compute. Hands down, I think Microsoft Azure Functions has the best logo here.

Are you using serverless in production yet? What are some of the challenges you are facing? How should testers get up to speed with this technology?


Getting started with Google Cloud serverless functions

Do you know how many lines of code you need to deploy a Google Cloud serverless function?

Short answer: 4.

exports.helloWorld = (req, res) => {
  let message = req.query.message || req.body.message || 'Hello World!';
  res.status(200).send(message);
};

https://cloud.google.com/functions/docs/quickstart-nodejs

Wait, what's a serverless function?

Have you heard of microservices? This is like that, but even smaller. What's the smallest service you could possibly deploy? A single function. Serverless still uses a server, but the infrastructure is abstracted away by your cloud provider of choice. If no one is hitting your API, there's no server running and costing you money. Cloud providers spin up an instance on demand as soon as something calls your function.

If you are interested in learning more, you should go to Serverless Sydney on Saturday the 27th of August. Tickets are $69. https://sydney.serverlessdays.io

How do you integrate a serverless function with a MySQL backend?

const mysql = require('mysql');

const connectionName =
  process.env.INSTANCE_CONNECTION_NAME || 'boglogger:australia-southeast1:boglogger2';
const dbUser = process.env.SQL_USER || 'DEMOUSER';
const dbPassword = process.env.SQL_PASSWORD || 'DEMOPASSWORD';
const dbName = process.env.SQL_NAME || 'bog_logger';

const mysqlConfig = {
  connectionLimit: 1,
  user: dbUser,
  password: dbPassword,
  database: dbName,
};

if (process.env.NODE_ENV === 'production') {
  mysqlConfig.socketPath = `/cloudsql/${connectionName}`;
}

// Connection pools reuse connections between invocations,
// and handle dropped or expired connections automatically.
let mysqlPool;

exports.logBog = (req, res) => {
  // Initialize the pool lazily, in case SQL access isn't needed for this
  // GCF instance. Doing so minimizes the number of active SQL connections,
  // which helps keep your GCF instances under SQL connection limits.
  if (!mysqlPool) {
    mysqlPool = mysql.createPool(mysqlConfig);
  }

  // Verify the connection works, then insert a hardcoded test row.
  // The response is sent exactly once, after the insert completes.
  mysqlPool.query('SELECT NOW() AS now', (err) => {
    if (err) {
      console.error(err);
      res.status(500).send(err);
      return;
    }

    mysqlPool.query(
      'INSERT INTO bogs_logged(time, device_uuid, type, amount, floater) VALUES(NOW(), "e5c15033-42b3-487d-b6c8-17a1b592e9bd", 2, "S", TRUE)',
      (err, results) => {
        if (err) {
          console.error(err);
          res.status(500).send(err);
        } else {
          res.send(JSON.stringify(results));
        }
      }
    );
  });
};

https://cloud.google.com/functions/docs/sql
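
If you want to poke at a function like this before deploying it, one option is to call the exported handler directly with stubbed request and response objects. This is only a rough sketch of mine, not part of Google's sample; it assumes the code above is saved as index.js and that the non-production config can reach a MySQL instance:

// smokeTest.js - a hypothetical local harness for the function above
const { logBog } = require('./index');

// Minimal stand-ins for the Express-style req/res that Cloud Functions passes in
const req = { query: {}, body: {} };
const res = {
  status(code) {
    console.log('status:', code);
    return this; // allows chaining res.status(...).send(...)
  },
  send(body) {
    console.log('response:', body);
  },
};

logBog(req, res);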

How about getting JSON out of a request?

const escapeHtml = require('escape-html');
exports.helloContent = (req, res) => {
  let name;

  switch (req.get('content-type')) {
    // '{"name":"John"}'
    case 'application/json':
      name = req.body.name;
      break;
  }

  res.status(200).send(`Hello ${escapeHtml(name || 'World')}!`);
};

https://cloud.google.com/functions/docs/writing/http

Combining HTTP and MySQL

Now let's combine the two and build an API that logs your device ID and your poo type into a Google Cloud SQL instance:

const escape = require('escape-html');
const mysql = require('mysql');

const connectionName =
  process.env.INSTANCE_CONNECTION_NAME || 'boglogger:asia-northeast1:boglogger-prod';
const dbUser = process.env.SQL_USER || 'addBogs';
const dbPassword = process.env.SQL_PASSWORD || '';
const dbName = process.env.SQL_NAME || 'bog_logger';

const mysqlConfig = {
  connectionLimit: 1,
  user: dbUser,
  password: dbPassword,
  database: dbName,
};

if (process.env.NODE_ENV === 'production') {
  mysqlConfig.socketPath = `/cloudsql/${connectionName}`;
}

let mysqlPool;

exports.addBog = (req, res) => {
  let device_id;
  let type;

  if (!mysqlPool) {
    mysqlPool = mysql.createPool(mysqlConfig);
  }

  switch (req.get('content-type')) {
    // '{"device_id":"123", "type": 2}'
    case 'application/json':
      device_id = req.body.device_id;
      type = req.body.type;
      break;
  }

  // Use ? placeholders so the mysql library escapes the values going into
  // the SQL (escape-html is only for the HTML response below), and send
  // the response exactly once, after the insert completes.
  mysqlPool.query(
    'INSERT INTO bogs_logged(time, device_uuid, type) VALUES(NOW(), ?, ?)',
    [device_id, type],
    (err) => {
      if (err) {
        console.error(err);
        res.status(500).send(err);
      } else {
        res.status(200).send(`BogLogged ${escape(device_id || '123')}!`);
      }
    }
  );
};

https://github.com/BugHunterSam/BogLogger/blob/master/day2Stream21-07-2019/severlessAPIAddBogs.js

My API is in production. If you hit this endpoint with a POST request:

https://asia-northeast1-boglogger.cloudfunctions.net/addBogs

with a JSON body like this:

{
  "device_id": "123abc",
  "type": 3
}

it will log your entry in a MySQL backend.
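
If you'd rather hit it from code than from a REST client, a Node snippet along these lines should work (fetch is built into Node 18+; on older versions you'd need a library like node-fetch):

// Save as callAddBogs.mjs and run with: node callAddBogs.mjs
const response = await fetch(
  'https://asia-northeast1-boglogger.cloudfunctions.net/addBogs',
  {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ device_id: '123abc', type: 3 }),
  }
);

console.log(response.status, await response.text());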

When you deploy the function above, you will need a few packages in the dependencies of your package.json file:

{
  "name": "sample-http",
  "version": "0.0.1",
  "dependencies": {
    "mysql": "2.17.1",
    "escape-html": "1.0.3"
  }
}

If you want to run your own API, you will need to create a Google Cloud SQL instance and add a user. You can use these commands to create the table and user:

-- create the bogs_logged table
CREATE TABLE bogs_logged(time DATETIME NOT NULL, device_uuid VARCHAR(36) NOT NULL, type INT(1));

-- create the addBogs user and grant only the permissions it needs
CREATE USER 'addBogs'@'%' IDENTIFIED BY '********';
GRANT SELECT, INSERT ON bog_logger.bogs_logged TO 'addBogs'@'%';

https://github.com/BugHunterSam/BogLogger/blob/master/day2Stream21-07-2019/SQL-users-and-triggers.txt

What’s the point of all of this?

I now have a serverless API in production for my mobile wearable app to track your poo. The next step is building the UI!


Getting Started with Google Cloud, Apps Script and BigQuery

The humble spreadsheet is one of the most versatile digital tools any tester has in their toolkit. I'm working on a data analytics project at InsightTimer where we are doing event tracking across our Android and iOS apps using mParticle and BigQuery. I want to do some data analysis on the events coming through for new Android builds; "basic sanity" testing of events is my goal. My first step towards this is to create a Google Sheets dashboard that integrates with BigQuery via a JavaScript-esque language called Apps Script. Apps Script is an amazing extension of the humble spreadsheet and really adds a lot of extra functionality. I'm going to walk you through how you can get started with a similar project of your own.

1. Sign up for a Google Cloud trial

I used a test account, sammy.test.cloud@gmail.com, to explore this for you, and I collected screenshots along the way.

Navigate to https://cloud.google.com/ and sign up for all that cloud magic.


You will get $300 worth of credit to explore the cloud


You will be asked for credit card details 🙁 but at least there is no auto-charge if you forget


Woot, the default project name is “My First Project”


Go to https://console.cloud.google.com/bigquery?p=bigquery-public-data to test out your BigQuery setup. BigQuery should be enabled for new projects by default.

There’s a whole bunch of public data sets for you to explore


We are going to explore the hacker_news dataset in this example. This is actually a collection of the data (comments and articles) from the Hacker News website.

Previewing the table is a handy feature for getting an idea of the structure of the data

The preview gives you some sample data, an idea of columns and a way to explore


If you want to find the top-ranked comments for author '4ad', you can run a query like the following (remove the \ characters if you are copy-pasting this query):

SELECT
  author,
  id,
  ranking,
  time_ts,
  text,
  parent
FROM
  \`bigquery-public-data.hacker_news.comments\`
WHERE
  author = '4ad'
  AND deleted IS NULL
ORDER BY
  ranking DESC,
  time_ts
LIMIT 10;


2. Setting up Google Sheets with Apps Script

Create a new sheet in your Drive folder; I've named mine BigQueryTest. Open up Tools > Script editor:


You will see a blank script editor. Copy and paste the code from https://developers.google.com/apps-script/quickstart/custom-functions
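
If you're curious what you've just pasted in, the heart of that quickstart is an onOpen trigger that adds a custom menu to the sheet. Roughly, the pattern looks like this (the menu labels match what you'll see below, but the handler names here are just illustrative):

/**
 * Runs automatically when the spreadsheet is opened (and can also be run
 * manually from the script editor, as we do below) and adds a custom menu.
 */
function onOpen() {
  SpreadsheetApp.getUi()
    .createMenu('Custom Menu')
    .addItem('Prepare sheet...', 'prepareSheet') // illustrative handler names
    .addItem('Generate step-by-step', 'generateStepByStep')
    .addToUi();
}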

Save. The first time you save, it will ask you to name your project


I ended up calling mine BigQueryAppScriptIntegration, original I know


Run the onOpen function. The first time you run this Apps Script, it will ask you to set up permissions


Review the permissions


Select your account


More Permissions


Whew, done. Now if you navigate back to your spreadsheet you should see some added menu items:


What happens when you click on the Prepare sheet… function?

Huh, fancy that. Our spreadsheet is updated


Then how about the Generate step-by-step function?


That's pretty cool, a new tab is added


You don't need to understand Google's code to explore this. I find it's better to learn by starting from scratch. This code gives you a guideline for what you can achieve, but you will have a better understanding of how Apps Script works by building out your own.
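
For example, a from-scratch custom function can be as small as this; save it and you can call it straight from any cell as =DOUBLE(A1):

/**
 * Doubles the input value.
 *
 * @param {number} value The number to double.
 * @return {number} The input multiplied by 2.
 * @customfunction
 */
function DOUBLE(value) {
  return value * 2;
}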

3. Enable Apps Script APIs in your Google Cloud project

Navigate to your Google Cloud console: https://console.cloud.google.com/apis/dashboard and click on Enable APIs and Services


Search for Apps Script and click Enable


4. Enable BigQuery APIs in your Apps Script

Navigate back to your Apps Script and go to Resources > Advanced Google services


Enable BigQuery APIs


Delete your old Apps Script code, then copy and paste the first code sample from https://developers.google.com/apps-script/advanced/bigquery

Replace the XXXXX string with your project ID; you can get this from your Google Cloud console


Replace the query with:

'SELECT COUNT (DISTINCT id) FROM [bigquery-public-data:hacker_news.comments]'
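
For context, here's a rough sketch of what the modified runQuery boils down to once you've swapped in your project ID and the query above (the real sample also pages through the results and writes them into a spreadsheet; 'my-project-id' is a placeholder):

function runQuery() {
  var projectId = 'my-project-id'; // placeholder - use your own project ID
  var request = {
    query: 'SELECT COUNT(DISTINCT id) FROM [bigquery-public-data:hacker_news.comments]'
  };

  // BigQuery here is the advanced service we enabled in the previous step
  var queryResults = BigQuery.Jobs.query(request, projectId);

  // Poll until the job finishes before reading the result
  while (!queryResults.jobComplete) {
    Utilities.sleep(500);
    queryResults = BigQuery.Jobs.getQueryResults(projectId, queryResults.jobReference.jobId);
  }

  // One row with one cell: the distinct comment count
  Logger.log(queryResults.rows[0].f[0].v);
}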


Save and run the runQuery function. You will also need to set up permissions again the first time you run this.


Huh, that's a new screen. I clicked on "Advanced" to get to a permission screen similar to the one I had previously encountered


Damn. I got another error message:

When I clicked on the details, it gave me a handy URL to navigate to


I went to the suggested URL and it turned out I needed to enable another API somewhere else. That makes me think I did it in the wrong spot in step 3; maybe that step wasn't needed?


I waited a few moments and went back to my Apps Script. I clicked on "Run Query". No error messages this time. If you go to your Drive folder, you should see a new sheet called "Big Query Results" with one row of data:


You can check that this result is roughly correct by running the query directly in your Google Cloud query editor:


Now you are all set up with your API integrations. The next lesson will be about building out your own dashboard that puts this setup into practice.