The three most important things I learned in 2017

2017 has been a wild year, to say the least. From waking up wondering if closed-mindedness was becoming mainstream to waking up wondering about the likelihood of nuclear war, to waking up in any one of over a dozen countries, there has been something remarkably unstable about 2017 (even for me).

But amidst all the shifting, there was also something very grounding about 2017. I married an amazing woman, and I finally found satisfaction in a day-to-day office job at Postmates. I started the year as a remote freelancer in Hong Kong and ended it hanging shelves in our new apartment in Oakland; both change itself and the adaptation to less change have taught me a few things. Here are some that come to mind.



The attainability of freedom

When I was a teenager, my family and I were visiting Yosemite National Park, where I met a freelance photographer for National Geographic. At that moment, I knew I wanted a job I could do from anywhere, one that would let me drift wherever my life might take me.

Overlooking the tea plantations of Sri Lanka as clouds slowly filled the valley below, while auditing an ecommerce site, I realized that it had actually happened. I had clients who were willing to pay me a fair wage and didn’t mind if I was in Sri Lanka, Israel, or the Moon as long as the quality of work was good.

What made this work was the same thing that makes any client relationship work: communication and dedication. By the time I was in Sri Lanka, I had already worked remotely for this client for several months from Hong Kong. I made sure I was responsive to email in the evenings and early mornings and willing to take phone calls at 11 pm. And to be fair, by the time I was traveling, I was working only up to 20 hours a week. This ensured that I could remain extra-dedicated despite beach and jungle time while keeping rice and curry on the table.

It was great while it lasted. With a fiancée and a wedding on the way, this life was short-lived, but it was real nonetheless.


The importance of disconnecting

When I look at my phone today, I see my emails, Instagram feed, and Coinbase losses in what Android developer mode calls “monochromacy.” This is part of an effort introduced to me by Tristan Harris, Google’s former “Design Ethicist & Product Philosopher”: to use my phone as a tool, not a drug. It might sound crazy until you really acknowledge how well product designers and analysts are optimizing for our continued use of their products. Constant, continual use is what they want, and coincidentally, constant, continual use is a good way to describe addiction.

My black and white phone screen, quite surprisingly, has had a massive effect on my interest in continuing to pop in and out of apps when I open my phone. That and killing all the notifications that “aren’t from people” has been a great release from the expectation of instant gratification. It has been extremely freeing.

It can be grounding too. Last year, my then-fiancée and I decided to charge our phones outside of our bedroom at night. This too effected a remarkable increase in our communication with each other. We are now married, and oddly, I recognize more and more each day how important that relationship is to foster and maintain. There is no post on Facebook that is more important than one word exchanged between us. It’s true. But it’s strange that we have to remind ourselves.

The importance of forgetting

Javascript: whatever you thought you knew doesn’t matter. Now even my dog has opinions on transpilers, and he scoffs at me for using jQuery. That is the story of technology and the paradigms that come and go with it. I can confirm that this is true in life as well.

Holding on to things, as I learned when I moved into a 400 sq. ft. apartment in Hong Kong, makes it hard to acquire new things. Holding on to old data is costly, and it makes finding relevant data slower and more difficult. Holding on to old ideas takes up room for new ideas.

Early on at Postmates, I became aware of a lot of records in the database that were either inaccurate or had no value to users, yet we were serving these records as pages nonetheless: asking Googlebot to crawl them and often allowing them to be indexed for searchers to find. These pages didn’t do anything besides shift Googlebot’s crawl budget away from the high-quality pages on the site.

I’ve never taken so much pleasure in cleaning. We disabled tens or hundreds of thousands of records at a time, and gradually we watched as Googlebot hit these 404s and gave up on its futile effort to index the bad pages. So far, we’ve seen crawl behavior shifting: crawl volume has not decreased overall, but the number and frequency of good pages being crawled have increased. By discarding the bad, the good came to light.

Change is not always that easy. Sometimes discarding is more difficult than making an API call. Sometimes clearing space means profiling, inspecting, and reconciling with what exists. I’ve come to believe that this, more than acquiring new information, is the important part of growth.

This last year, as new models of acceptable male behavior have become popularized, normalized, and sometimes flat-out exposed, I have looked into a “Black Mirror” of sorts and seen what a lifetime of intentional or even passive ignorance becomes. Inspecting and reconciling with the signals of this ignorance has led to a cognitive dissonance between what I know to be right and what I was comfortable believing. It was jarring to reconcile my egalitarian principles with the societal norms that I had taken advantage of and implicitly endorsed. Clearing away the old junk brings more good to light.

This was also, in part, why I found the need to disconnect. I got tired of breathing the same air in my little corner of the Facebook graph. I got tired of seeing only what I was expected to agree with, and I became suspicious of a feedback loop that self-corrected me into a narrow view of the world. These began to look like the same conditions that led to the societal norms I mentioned above. I wanted to leave room for something different, something outside of this little world to explore.



If you follow this blog, you know about my passion for learning. I write it because it helps me learn and because it encourages others to explore. My hope for 2018 is the same as my hope for this blog: that open-mindedness and an explorative nature spread. I look forward to exploring React and getting involved in the tech education community here in Oakland. I hope you consider leaving, or making, a little space for some exploration too.

Analyzing XML Sitemap Files with Bash

I’ve been spending a lot of time working with sitemaps lately. They are a critical component of getting sites indexed by search engines, and they are a great way to learn about your competition’s architecture and search strategy. I mean, they literally map out the content that your competition is trying to add to search engine indexes and where they are trying to attract traffic.

The problem is, when sitemaps are larger than a few thousand URLs, or when sitemaps are not pretty-printed, or when index files and gzip compression get involved, the task of reading a sitemap becomes either slow, or manual and really slow. So… Bash! Crashing to the ground like an ancient and overly cryptic superhero, er… programming language, Bash is here to face off with its perfect use case.

If you are intrigued but unfamiliar with the Linux shell, aka “the terminal,” this is a perfect way to get your hands dirty and decide if it’s something you want to become more familiar with.

This post will introduce the syntax and a few powerful commands that allow you to download, parse, and analyze sitemap files with Bash.

A Background on Bash

Bash (the Bourne Again SHell) has several welcoming characteristics. Like Javascript is to web browsers, Bash is a standard language of Unix-like operating systems such as Linux and macOS. This means you don’t have to worry about installing it; it’s already in the box!

Speed is also a big upside. The language itself is fast, and since you don’t have to write a lot of code, it’s possible to get a lot done quickly. It’s far from being the Swiss Army knife that Python is, but with the right use case, Bash is as powerful as its name sounds like it should be.

On the flip side, the worst thing about Bash, unlike Python and other modern languages, is that the lack of resources makes it hard to learn. “Man(ual)” pages are overly terse and sometimes feel like they were written in their own language. Hopefully, this intro will be a little more engaging.

Bash also has limited use cases. It’s great for working with file systems and text files, but it wouldn’t be my first choice for assembling anything that’s very interactive. But now we have our use case, so let’s Bash!

The Short of It

In two lines, we are going to request a sitemap index file and write every sitemap URL to a text file.

curl <sitemap index URL> | \
grep -e loc | sed 's|<loc>\(.*\)<\/loc>$|\1|g' > sitemaps.txt

We can then request those sitemap files and analyze the URLs listed in them:

curl <sitemap URL> | \
gunzip | grep -e loc | \
sed 's|<loc>\(.*\)<\/loc>$|\1|g' | \
grep -e <filter pattern> | sort

With Python, this would take several libraries, more time reading docs, and more code to accomplish the same thing. With Bash, we only need six commands: curl, grep, gunzip, sed, sort, and uniq.

Bash programs are assembled like Legos (or plumbing, if that suits you better). So in order to understand how this all works, let’s take these programs apart and put them back together.

The Long of It

As you can see, Bash code is really terse. There’s a lot going on with little instruction and little explanation. But be patient and keep reading, and this should all make sense.

Piping | data

You may have recognized a pattern: each line is broken up into pieces by the pipe character, “|”. “Piping” conveniently does what it sounds like it does. It sends data along a pipeline of programs, each of which manipulates that data and passes it on. More specifically, each program sends its output (stdout), line by line, into the input (stdin) of the next program. To demonstrate, try this:

ls | head -n 3

That line of code says: list the contents of the current directory and pipe it to the head command, which outputs only the first three lines of its input. Similarly:

cat my-file.csv | sort > my-sorted-file.csv

That line of code says: read the contents of my-file.csv and pipe it to the sort command, which sorts the lines of that file alphabetically. The angle bracket (>) at the end of the line means “put the output of these commands into a file named my-sorted-file.csv.” It’s all pretty simple, and it will make more sense as we build up the scripts above.
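To convince yourself how piping and redirection combine, here is a tiny sketch you can run anywhere; the file name sorted.txt is just an example for the demo.

```shell
# Three unsorted lines piped through sort, redirected into a file
printf 'banana\napple\ncherry\n' | sort > sorted.txt

# Read the file back: apple, banana, cherry
cat sorted.txt

# Clean up the example file
rm sorted.txt
```

Note that printf stands in for any program that produces lines of text, which is exactly how the sitemap commands below will feed their pipelines.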

cURL and gunzip to GET and unzip Sitemap Files

Most of what the two scripts do is text processing and filtering, but the first step is getting the sitemap files. You’ve probably come across curl if you’ve ever read through REST API documentation. It is a short command with a lot of options but, in our case, we can keep it simple and use curl as is. To ease into the curl command, let’s use it to find the location of a website’s sitemap.

Most sites list the URL of their sitemap in their robots.txt file. Making an HTTP GET request to a site’s robots.txt file is simple:

curl <domain>/robots.txt

Now is also a good time to introduce grep. Instead of scanning through the robots.txt file yourself, you can use grep to quickly find the sitemap URL. Just pipe the output of the robots.txt request to grep, filtering for the regular expression pattern “sitemap:”

curl <domain>/robots.txt | grep -i -e 'sitemap:'

The result of this command should be the line of the robots.txt file that lists the location of the sitemap. Note that grep -i makes the regular expression pattern match case-insensitive. That’s useful because some sites start the line with a capital “S” in “Sitemap.”
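If you want to see the case-insensitive flag in action without making any requests, you can stand in for a site’s robots.txt with printf; the URL and the Disallow line below are made up for the demo.

```shell
# Simulate a robots.txt that capitalizes "Sitemap",
# then filter for the sitemap line case-insensitively
printf 'User-agent: *\nDisallow: /admin/\nSitemap:\n' \
 | grep -i -e 'sitemap:'
# prints: Sitemap:
```

Without -i, the lowercase pattern 'sitemap:' would miss this line entirely.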

To get more comfortable with curl, try it with the URL of this page, and you will see that it prints a mess of HTML markup to the terminal window. (See below for pretty-printing XML.)

Many sites with really large sitemaps use gzip compression to reduce the amount of time it takes to download them. Using the curl command to get gzipped sitemaps doesn’t work too well; it will just print a mess of ugly binary code to the terminal. Luckily, the gunzip command can unzip compressed sitemap files. Just pipe the zipped response to gunzip and get readable XML back.

curl <sitemap URL> | gunzip

But what happens if the output is not pretty-printed XML?

The xmllint utility (part of the libxml2 toolset and preinstalled on most systems) can pretty-print XML:

curl <sitemap URL> | gunzip | xmllint --format -

Finally, to avoid making tons of requests to the same sitemap while you’re learning, you can store the sitemap file on your computer by sending the output to a file:

curl <sitemap URL> | gunzip > saved-sitemap.xml

Now, instead of requesting the sitemap every time, you can just cat the file, which outputs every line of the file from top to bottom, like so:

cat saved-sitemap.xml | head

That will show the first ten lines of the saved file. Or use tail for the last ten lines.

Parsing the XML and Finding URLs

Hopefully, piping data from program to program is a little more comfortable now, because this is where we get to start taking advantage of the better parts of Bash.

XML sitemaps follow a strict format, which makes them easy to parse. But there is a lot of extra text in them that is not particularly interesting. To find the part that interests us, the URLs, we will use grep, which sounds like a burp but is actually used to filter a stream that is piped to it.

curl <sitemap URL> | grep -e <regular expression filter pattern>

Grep reads each line of input and, if the line matches the regular expression pattern after the -e, passes it along, filtering the contents of the input stream. We find all <loc> XML tags by filtering out every line that doesn’t match the expression ‘loc’.

curl <sitemap URL> | grep -e 'loc'

This is where things get interesting.

The sed command is short for Stream EDitor. Like other Bash commands, it operates on one line at a time. Sed can be a very powerful tool for rewriting file contents, but for our purposes, it will extract sitemap URLs from the <loc> elements filtered in by the grep command.

curl <sitemap URL> | grep -e 'loc' | sed 's|<loc>\(.*\)<\/loc>|\1|'

This chain of commands gets the sitemap, filters in all <loc> elements, then extracts the URL from each <loc> element, leaving only a list of URLs, one per line.

Sed allows you to use regular expressions to determine what part of a line you want to keep or replace. The parts of a sed expression are separated by a delimiter, in this case, pipes (|). Note that the delimiter could be almost any character, such as a colon or a slash, but I like to use pipes because they look cleaner when working with URLs (which are full of colons and slashes).

The command works like this: sed '<mode>|<regex to match>|<replacement>|<flags>'. Sed captures anything between the escaped parentheses \( and \) and makes it available as \1 in the replacement. So in sed 's|<loc>\(.*\)<\/loc>$|\1|', everything (the .*) between <loc> and </loc> is captured, and the whole matched line is replaced with the captured text, which is the URL.
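To see the capture group at work on a single line, try feeding sed one hand-written <loc> element with echo; the URL is a made-up example.

```shell
# \(.*\) captures everything between <loc> and </loc>;
# \1 replaces the matched line with the captured URL
echo '<loc></loc>' | sed 's|<loc>\(.*\)<\/loc>$|\1|'
# prints:
```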

That leaves the command that we started with:

curl <sitemap URL> | grep -e loc | \
sed 's|<loc>\(.*\)<\/loc>$|\1|g' | grep -e <pattern> | sort

With that Bash code, you will get all the URLs in a sitemap, sorted alphabetically. Note that the \ at the end of a line continues the command onto the next line. Add | wc -l to the end of the line to get the number of lines in the output of those commands.

Now you have the tools to analyze not only large sitemap files but all kinds of large text files!

My interest in Bash has grown because it makes it so easy to compose functions together to get and store the results that you need. You now know all the basic building blocks for analyzing a sitemap, so now try it for yourself. Try using grep to see what, and how many, files live in certain directories. Use sed to extract the first directory from each URL and pipe that to uniq to see how many first-level directories there are. Any question you have about the contents of a sitemap, you can now answer with Bash, and fast! Good luck!
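As a sketch of that last suggestion, here is one way to count URLs per first-level directory, using cut instead of sed to grab the first path segment; the sample URLs and the file name urls.txt are invented for the demo.

```shell
# Make a small sample URL list
printf '\n\n\n' > urls.txt

# Field 4 of a URL split on "/" is the first-level directory;
# uniq -c counts each one and sort -rn puts the biggest first
cut -d/ -f4 urls.txt | sort | uniq -c | sort -rn

# Clean up
rm urls.txt
```

On the sample data, this shows fruits twice and veg once, a miniature picture of how a site’s content is distributed across its directories.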

There are a couple great books out there if you want to learn more about Bash as a data analysis tool:

Bash Cookbook and Data Science at the Command Line

How I became a Technical Marketer

As you know, I love learning, and I am a huge advocate for self-service education. Recently, I had the honor of being interviewed on the Measureschool YouTube channel, where I got to share my path in learning and employing the technical side of digital marketing. We talked about some of the resources I mentioned in the series on Digital Marketing Technical Skills, and in the long version, I share my fascination with GitHub and push messaging.

I hope the video encourages others to start and/or continue their technical learning journey!

Julian Jueneman also just launched a community that is a great place to learn from and share with others who are diving into all things technical marketing.

Meanwhile, I have been slow on this blog lately, but only because I have been more involved in a couple of other projects:

  • Keyword Visualization Tool: Currently in super-beta mode, but I would love it if you tested it out and provided some feedback on how it could be more useful. If you get Javascript errors, please share those with me too!  :*(
  • It is what the name implies. Learn about how to use Google Apps Script to make networked data solutions for yourself!
  • A fake-fake news site that allows you to make up stories and share them on social media to see how many people will click them. Here is the Medium article about what I learned in the process. Pretty crazy stuff!

Never stop learning!

JSON-LD Bulk Editor and File Generator for Google Sheets

As JSON-LD usage spreads across the internet and its utility increases, I find myself creating a lot more JSON-LD templates for non-standard entities… and then I find myself copying them over and over for each of their sibling entities. Finally, I decided I had had enough and looked to solve this problem for myself and other SEOs in the same boat.

The JSON-LD Bulk Editor for Google Sheets is a Google Spreadsheet template and script that takes a “prototype” JSON-LD template and allows you to create as many customized copies of it as you need by filling in a spreadsheet. It then allows you to save each JSON-LD schema to its own JSON file to add to your site or app.

JSON-LD Bulk Generator Use Cases

I admit, I believe in the future of the semantic web, and because of that, I think there will be a lot more use of JSON-LD. Developers will be able to create a lot of interesting applications on top of all that nice, clean data, and we will hopefully benefit from this easy access to information.

Google and others are already consuming JSON-LD, but because their primary focus is on things like Products, Recipes, and Locations, developers have created JSON-LD generators only for those specific and rigid schemas.

This tool allows you to create and edit the schema that perfectly fits your use case. 

Example Use Cases:

  • You need a lot of customized markup for non-standard entities like CreativeWorks or Intangibles subclasses
  • It’s not worth building a schema into your CMS just to mark up a few similar but unique pages
  • You want to add JSON-LD to several similar but unique hard-coded HTML pages
  • You are using Google Tag Manager but don’t have access to the variables needed for the application/ld+json HTML tag.
  • Hosting a few .jsonld files is easier than modifying your database

If you are wondering about how to put JSON-LD onto the page with Google Tag Manager, learn more about embedding JSON-LD with Javascript.

How to Edit and Generate JSON-LD Files



Then follow these three steps:

Step 1: Paste your “prototype” schema 

First, create a “prototype” schema that will be copied with unique values. Use a code editor to avoid any JSON syntax errors.

Then paste your prototype schema into the green box on the “Start” tab.

The schema of the prototype will be identical to that of its copies. You can modify the schema in the next step but it is easiest to start with a perfect prototype.

Step 2: Create and Edit the Spreadsheet Template

In the JSON-LD Bulk Editor menu, click Create schema editor.

Behind the scenes, Google Apps Script is “flattening” the JSON object and making a header row of object keys on the Generator tab and filling the second row with the values from the prototype object.

Edit JSON-LD in Google Spreadsheets


The header row shows the JSON object keys in “dot-notation.” This is similar to how Google Tag Manager, MongoDB, and Javascript in general allow you to access values within objects. So if the schema is:

{ "aggregateRating": {
    "@type": "AggregateRating",
    "bestRating": "100",
    "ratingCount": "24",
    "ratingValue": "87"
  }
}

The dot-notation for the rating value would be aggregateRating.ratingValue. This works the same way for arrays: the 0th element of an array would be someArray.0, and the 12th element would be someArray.12.

Now simply fill in the values in each row that correspond to the headers for each entity.

Step 3: Generate some JSON-LD

Now that each entity has been filled out in the spreadsheet, it’s time to make some files! Just go to the JSON-LD Bulk Editor menu and click Save JSON files to Drive.

Behind the scenes, Google Apps Script builds each JSON object based on the values in the spreadsheet. The schemas are then saved to a Google Drive folder called “JSON-LD Schema {{timestamp}}”. Each file is saved under the file name that was assigned in the spreadsheet, with the file extension .jsonld.

Save JSON-LD files to Google Drive

Your .jsonld files are ready!

Make Some Semantic Web!

I hope you will find this tool as useful as I have. For an easy way to use these JSON files, check out embedding JSON-LD with Javascript.

If this works for you or you see any improvements that could be made, it would be great to hear from you in the comments.


Using Hosted JSON-LD Files as application/ld+json Scripts for SEO

Sometimes it’s just easier to separate concerns. Just as stylesheets and .js scripts separate a page’s presentation and behavior from its content, the same can be done with the page’s JSON-LD schematic markup / structured data. This example shows how you can add structured data for SEO using Javascript, which in many cases may prove much easier than messing around with your server-side code or CMS.

The following Javascript shows how you can load a stored .jsonld file from your server onto your page as an application/ld+json script tag. Just add it to your page to include hosted JSON-LD files.

How to Add JSON-LD Markup with Javascript

The code below does the following four steps.

  1. The script makes a call to the local .jsonld file
  2. When the file is returned, an application/ld+json script tag is created
  3. The contents of the .jsonld file are inserted as the contents of the script tag
  4. The tag data is ready to be consumed by other applications

The script uses jQuery but the same could be achieved with plain js or any other library/framework of choice. This code can also be added by a tag management system such as Google Tag Manager.

   // Fetch the schema file and add it to the page's <head> as an
   // application/ld+json script tag
   $.getJSON( "/your-schema-file.jsonld", function( data ) {
     $( "<script/>", {
       "type": "application/ld+json",
       "html": JSON.stringify(data)
     }).appendTo( "head" );
   });

Does Google Think it’s Valid Structured Data?

Yes. See this demo for a live working example, or go straight to Google’s Structured Data Testing Tool to see the valid results of the demo. Other crawlers may not recognize the markup because it is rendered onto the page with Javascript, so the crawler must be able to execute Javascript, which is still not all that common.

Creating .jsonld Files

To learn about every minute detail of creating .jsonld files, see the JSON-LD syntax spec. But essentially, .jsonld files are no different syntactically than JSON files. It is only the specific way that JSON-LD files signify entities that differs from plain JSON.

If you need to create multiple JSON-LD files, check out this Bulk JSON-LD Generator for Google Sheets.

I hope you find this useful. I would love to hear your thoughts in the comments.

Google Apps Script doGet and doPost Tutorial + 6 Web App Examples

Google Apps Script Web Apps

(This post has been moved to a site all about Google Apps Script Tutorials.)

If you are new to Google Apps Script or programming in general, Google Apps Script Web Apps may seem daunting. The good news is that it is easy to understand with a little trial and error.

This tutorial will give you 12 recipes to help you get started with the doGet and doPost functions and show you how to return HTML, JSON, and plain text. From there, you can go on to customize your Web App to your needs.  But first, let’s get the basics out of the way.

What do doGet and doPost do?

Since you are starting to build a web app, you should probably know about HTTP. HTTP is the specification for how different machines on the web send and receive information from one another. HTTP allows several types of “requests,” but the ones we see most often are GET and POST requests. The doGet and doPost functions define how a Google Apps Script web app handles GET and POST requests from other machines, and determine what the app returns or does.

A simple example of a GET request happens every time you enter a URL into your web browser. The web server at that domain then handles your request and returns an HTML web page. GET requests are the most common type of request on the internet. For our purposes, GET requests are made up of a domain, e.g.; a path, e.g. /fruits/apples/; and often a query string, e.g. ?variety=fuji&count=4. With Google Apps Script, the domain will always be, the path will always look something like /macros/s/AKf...ycb/exec (or /dev), and the query string can be whatever you want it to be in order to specify how the web app dynamically generates the content it returns.

POST requests are similar to GET requests but slightly more powerful. Rather than asking for content, they are used to send data from one machine to another. The most common use of POST requests is web forms, like signup forms and ecommerce checkout pages. POST requests are not limited to sending form text, though. They can transfer images, audio, and a number of other types of files. This tutorial will only cover JSON, but if you need more than that, you probably don’t need this tutorial!

Your First Web App “Hello World”

Let’s get a few easy examples out of the way. The next three code snippets serve essentially the same content in three different formats.

To try them out:

  1. Copy and paste them into a new Google Apps Script.
  2. Click Publish on the main menu and then Deploy as web app…
  3. Click “Test web app for your latest code.”

Returning Plain Text

The ContentService.createTextOutput method is the most basic way to return content from a GET request, but it is the building block for most other types of content. The TextOutput object created by this method returns a response of type text/plain by default, but the type can be set to ATOM, CSV, ICAL, JAVASCRIPT, JSON, RSS, TEXT, VCARD, or XML.

function doGet(){
 var textOutput = ContentService.createTextOutput("Hello World! Welcome to the web app.");
 return textOutput;
}

Returning JSON

This is a simple pattern. Just serialize the Javascript object as JSON using JSON.stringify(), and set the Content-Type header with the setMimeType() method. This is important when returning different media types because you don’t want to make the recipient of your response guess the MIME type. For more on this, see my Google Apps Script JSON REST API tutorial.

function doGet(){
 var appData = {
  "heading": "Hello World!",
  "body": "Welcome to the web app."
 };
 var JSONString = JSON.stringify(appData);
 var JSONOutput = ContentService.createTextOutput(JSONString);
 JSONOutput.setMimeType(ContentService.MimeType.JSON);
 return JSONOutput;
}

Returning HTML

The HtmlService.createHtmlOutput() method has two purposes: to sanitize the HTML of any security risks before it is returned to the browser and to implicitly set the MIME type to text/html. To construct web pages, you can assemble the HTML string, as shown below, or you can use templates.

function doGet(){
 var HTMLString = "<style> h1, p {font-family: 'Helvetica', 'Arial'}</style>"
  + "<h1>Hello World!</h1>"
  + "<p>Welcome to the Web App</p>";
 var HTMLOutput = HtmlService.createHtmlOutput(HTMLString);
 return HTMLOutput;
}

Dynamic Web Pages with the HTML Service

External Content, Query Strings, and HTML Templates

Let’s make this interesting. Google Apps Script can assemble web pages based on dynamic inputs. In the example below, we use an external CSS file and the query string parameters to create dynamic content. The doGet and doPost functions take one default argument (commonly e), which holds information about the request “event.” From that e object, you can pull out information like query string values, as shown below. See the working version of the web app here.

To take this one step further, you can create HTML template files in Google Apps Script and evaluate those templates rather than creating the template in the script. Additionally, you can use information from the Google Drive, Gmail, and other Google APIs to populate your app.

function doGet(e){
 // use an externally hosted stylesheet
 var style = '<link href="" rel="stylesheet">';
 // get the query "greeting" parameter and set the default to "Hello"
 var greeting = e.parameter.greeting || "Hello";
 // get the query "name" parameter and set the default to "World"
 var name = || "World";
 // create and use a template
 var heading = HtmlService.createTemplate('<h1><?= greeting ?> <?= name ?>!</h1>');
 // set the template variables
 heading.greeting = greeting; = name;
 var content = "<p>Welcome to the web app.</p>";
 // evaluate the template and assemble the page
 var HTMLOutput = HtmlService.createHtmlOutput(style + heading.evaluate().getContent() + content);
 return HTMLOutput;
}

Google Apps Script doPost Example

An API Proxy

This example was inspired by my last post about custom Slack slash commands. This doPost example demonstrates how to handle POST requests, access request query string parameters, make HTTP requests to other services (in this case, the Chuck Norris joke API) with UrlFetchApp, and break the script into functions.

I made this because I got data from the Slack app that was formatted as a query string,


that I needed to use to get data from an API that would be formatted as JSON,

{ "type": "success", "value": { "id": 268, "joke": "Time waits for no man. Unless that man is Chuck Norris." } }

to then send back to the Slack app as plain text.

Time waits for no man. Unless that man is Trevor Fox.

This problem also exists if you want to use another service that requires data to be formatted in a specific way, like the IFTTT Maker Channel.

function doPost(e){
 // Get the "text" query string value
 // e.g. &text=Your%20Name
 var name = e.parameter.text;
 // Get the Chuck Norris joke
 var chuckQuote = postChuckNorris(name);
 // Return plain text output
 return ContentService.createTextOutput(chuckQuote);
}

// Make a call to the Chuck Norris joke API,
// parse the JSON, and return only the joke
function postChuckNorris(name){
 var queryString = makeQueryString(name);
 var chuckData = UrlFetchApp.fetch("" + queryString);
 var chuckJSON = JSON.parse(chuckData.getContentText());
 var chuckQuote = chuckJSON.value.joke;
 return chuckQuote;
}

// Helper function to assemble the query string for
// calling the Chuck Norris API from a given name
function makeQueryString(name){
 var query = "";
 if (name !== undefined){
  var names = name.split(" ");
  query = "?firstName=" + names[0];
  if (names[1] !== undefined){
   query += "&lastName=" + names[1];
  }
 }
 return query;
}
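The helper's intended behavior can be checked outside of Apps Script with a plain JavaScript version; this is a standalone sketch, not tied to any Apps Script services:

```javascript
// Standalone copy of the makeQueryString helper, so its behavior
// can be sanity-checked outside of Apps Script.
function makeQueryString(name) {
  var query = "";
  if (name !== undefined) {
    var names = name.split(" ");
    query = "?firstName=" + names[0];
    if (names[1] !== undefined) {
      query += "&lastName=" + names[1];
    }
  }
  return query;
}

// makeQueryString("Chuck Norris") → "?firstName=Chuck&lastName=Norris"
// makeQueryString() → "" (the API then returns a joke about Chuck himself)
```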

More than Just Returning Content

Dynamically Create and Save a PDF File to Google Drive

This example is similar to the dynamic HTML example above, but instead of making a dynamic web page, it creates and saves a PDF document in Google Drive. This would also work with doPost in a situation where you would want to send more information to the app when creating your PDF. Check out the live script. *Note: this will create a PDF file in your Google Drive each time you go to the URL.

This takes the functionality of the Google Apps Script web app one step further. This script demonstrates the capability to carry out any task or trigger any other service just from GET’ing or POST’ing to your web app.

function doGet(e){

 // get the query "greeting" parameter and set a default of "Hello"
 var greeting = e.parameter.greeting || "Hello";
 // get the query "name" parameter and set a default of "World"
 var name = e.parameter.name || "World";
 // create the PDF
 var pdf = makePDF(greeting, name);

 // save the PDF to Drive
 var driveFile = DriveApp.createFile(pdf).setName("greetings.pdf");

 // tell the user how to access it
 var fileURL = driveFile.getUrl();
 var fileName = driveFile.getName();
 var HTMLOutput = HtmlService.createHtmlOutput("<p>You made a PDF.</p>"
  + "<p>You can download it here: "
  + '<a target="_blank" href="' + fileURL + '">' + fileName + '</a></p>');
 return HTMLOutput;
}

function makePDF(greeting, name){

 // use an externally hosted stylesheet
 var style = '<link href="" rel="stylesheet">';
 // create and use a template
 var heading = HtmlService.createTemplate('<h1><?= greeting ?> <?= name ?>!</h1>');
 // set the template variables
 heading.greeting = greeting;
 heading.name = name;
 var content = "<p>Here is your PDF.</p>";
 // assemble the document and convert it to a PDF blob
 var PDFOutput = HtmlService.createHtmlOutput('<div class="container">'
  + style + heading.evaluate().getContent() + content + '</div>');
 var blob = Utilities.newBlob(PDFOutput.getContent(), "text/html", "text.html");
 var pdf = blob.getAs("application/pdf");
 return pdf;
}

Publishing and Permissions

When you are ready to launch your new Google Apps Script web app, you should save a version and define who can access your app and how (both users and other machines). There are a couple of important things to note.

If your app offers any capability to interact with Google Drive or other Google services on behalf of the user, the app should be set to Execute the app as: User accessing the web app.

If you want to allow other services to interact with your web app (as an API or proxy), the app should be set to Execute the app as: Me, and allow access to Anyone, even anonymous.


I hope this gets you started! Let me know if you are having any trouble getting your app up and running in the comments. Happy hacking!

Slack Slash Commands for Everyone + 10 Free APIs

You might be new to the whole idea of “bots” or acronyms like REST, API, HTTP, and cURL. And you probably don’t know how to code your way out of a wet paper bag. But this doesn’t mean you cannot or should not get your hands dirty and have a little fun making a customized little tool for yourself and your team. This is a quick and easy “Hello World” tutorial for creating your own Slack slash commands that anybody, no matter how technically savvy, can follow.

Custom slash commands for Slack allow you to access information from across the internet without leaving your favorite messaging tool. In essence, slash commands allow your Slack app to act as a web browser, but instead of returning and displaying a fully rendered HTML web page, Slack displays a bit of text information from a slash command “bot.” It’s as simple as that.

How Custom Slash Commands Work

Slash commands work similarly to the way your web browser retrieves an HTML web page from the URL in the address bar. The difference is that the Slack app retrieves the plaintext information found at the URL you set in your custom slash command. In both cases, the browser and the Slack app are sending an HTTP “GET” request to another server. Slack can also send a POST request to send information to a given location, but this tutorial is meant to be quick and easy for non-techies, so we are going to stick to retrieving plain text content.

So you are wondering, where can I get this text content if I can’t program a server to return something? Luckily there are plenty of plaintext APIs that return useful, interesting, and sometimes silly information. Check them out by skipping to the list of plaintext APIs below.

Choose one of the APIs or find another one that you want to use. Then test the API by pasting the URL into your browser. If the browser displays only text without any formatted HTML, you’re ready to make your slash command!

Create Your Custom Slash Command

Let’s get started! First, go to: {{yourSlackTeam}} and select Slash Commands. On the next page, click Add Configuration. Then just follow the next three steps.
create a slack slash command

  1. Setup Your Request

    Once you have found the text API that you want to use in your slash command, enter the Command name, the URL of the API request, and leave the Method as “GET.”  For this demo, we are using the Lorem Ipsum generator API found here: Click the link to see what it provides.

    name your slash command

  2. Customize the Appearance

    Here is the fun part. You get to decide what your slash command’s response looks like. Choose a cute name and upload an image or choose from any one of the many Slack emojis.
    customize the slack bot appearance

  3. Help Users Use It

    For the Description, provide a short explanation of what your slash command will return to the user.
    describe your slash command

  4. Try it out!

    Type in your slash command or type part of the command and use the tab-complete to fill in the command. Hit enter and boom! You have your first working slash command.
    use your slack slash command

10 Free Plain Text APIs for Custom Slash Commands

Now that you know how to make your own slash command integration, here are a few APIs to play around with. If you made one that you want to share or found one that you like, please leave a link in the comment section.

  1. Lorem Ipsum: Just like the one in the demo, get all the Lorem Ipsum text you could ever ask for at
  2. Check Your IP Address: If you move around a lot and you need to know your IP address, just call this API:
  3. Get a Currency Exchange Rate: With a free API key, you can get more API calls than you will ever need. Here is the URL for US Dollars to Hong Kong Dollars:
  4. History Trivia: Get a trivia fact about a random year in world history: or a random date:
  5. Random Number Trivia: Get a trivia fact about a random number.
  6. Chuck Norris Jokes: Ok, this one returns JSON, not plain text, but it’s short enough to parse visually quite easily. Plus it’s just too funny to pass up.
  7. Profound Quotes: Sometimes it’s just nice to hear some Nietzsche and then some Twain. Chat with the greats:
  8. Yes/No Decision Making: Can’t decide whether you should do something? There’s an API for that. This is another JSON API, but since the response is so short, it is still easy to read.
  9. Magic 8 Ball Decision Making: For the tough questions, you need a little magic. This one also returns human-friendly JSON, and can only answer one question per Slack command, so make it a good one!…%3F
  10. When does Donald Trump Leave Office? If you read the news in America, you may find yourself wondering this. Now Slack can answer this question at a moment’s notice.

Have fun! And remember, “with great power comes great responsibility.”

12 Simple Google Spreadsheets Sparkline Examples

I just finished a project where I was using IPython and Pandas a lot more than spreadsheets. I found that after using Pandas, I started to think about data a little differently. When I returned to Google Spreadsheets, I realized it wasn’t capable of displaying the data in the way that I wanted. So I took another look at the Google Spreadsheet SPARKLINE function and found that, in many cases, sparklines provide a more informative and much quicker visualization than charts. Once I got started, I figured I would exhaust the possibilities of the feature and share them here so that you can get started with them quickly and easily when you need to.

There are four different types of sparkline charts. I have a section for each:

Horizontal Bar Charts

Single Series Data

Best for:  Comparing every cell in a column


Basic Bar Chart


Google Spreadsheet Bar Chart Sparkline

It’s pretty straightforward. Make sure to set the max option so that the bar does not take up the full column.
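For reference, a minimal version of this chart might look like the following, assuming the value is in B2 and the data column is B2:B10 (the cell references and color are placeholders):

```
=SPARKLINE(B2, {"charttype","bar"; "color1","orange"; "max",max(B$2:B$10)})
```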



Reverse Direction with Max Argument and Color

=sparkline(E42,{"charttype","bar";"color1","teal";"rtl",true; "max",max(C$36:C$44)})

rtl (right to left) changes the direction of the bars. Use the max function for the value of max with the data column as the range to set the width limit to the highest value in the data column.


Stacked Series Data

Best for:  Comparing multiple columns with each row as parts of whole


Stacked “Versus” Chart



Set colors to hex codes to make visually appealing sparklines. Leaving max unset will set the width of the bar to the width of the cell.

Stacked Bar Chart


The Q1 and Q2 values for the 2014 row do not have numerical values. Setting nan to “ignore” will exclude these values from the sparkline bar chart. The max option is set to the highest value of the Total column.



Line Charts

Single Series (Y) Data

Best for:  Displaying time series data where values change over consistent time intervals

Standard Line Chart and the Effects of ymin

=SPARKLINE(B3:G3,{"charttype","line"; "color","indigo"; "linewidth",2})
=SPARKLINE(B3:G3,{"charttype","line";"color","indigo"; "ymin",0})

Sparkline Line Chart Custom Color

This chart uses a custom color and linewidth. It also demonstrates how to change the chart’s background color by changing the cell’s background color. Both charts display the same data, but because the second chart sets the ymin option to 0, it looks much different. Be careful when setting limits and cell dimensions. These adjustments can make the data appear to say different things.


Line Chart with Conditional Coloring

=SPARKLINE(C7:H7,{"color",if(H7>C7,"green","red");"ymax",100; "linewidth",2})

Sparkline Line Chart with Conditional Color

The color is set to an if formula. This formula compares the first and last value of the series, and if the last value in the series is greater than the first value, the line color option is set to green; and if not, the color is set to red. The slope formula is also good for this, but it is better with X,Y series data.


Dual Series (X and Y) Data

Best for:  Displaying time series data where values change over inconsistent time intervals

Line Chart with Conditional Coloring

=SPARKLINE(C12:H13,{"color",if(slope(C13:H13,C12:H12)>0,"green","red"); "linewidth",2; "ymax",25; "xmax",15})

Sparkline Line Chart with Two Variables and X Limit

There is a lot going on here, but it builds on the previous example. The slope function is used to set the color of the line. If the slope is greater than 0, the line is green; else it is red. Additionally, ymax and xmax are set to values greater than any data value. This creates a margin around the line chart.


Line Drawing… Because you can

=SPARKLINE(C26:O27,{"color","red"; "linewidth",8})

Sparkline Line Drawing

The lines will follow the X and Y coordinates even if they are not consecutive. Go nuts!



Win/Loss Charts

Best for:  Displaying binary data like wins and losses

Custom Colors and Colored Axis


Sparkline Win Loss with Custom Color

Colors help draw attention to the data more quickly than just shapes.
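As a sketch, assuming a row of 1s and -1s in B2:K2, a win/loss sparkline with custom colors and a visible axis might look like this (cell references and colors are placeholders):

```
=SPARKLINE(B2:K2, {"charttype","winloss"; "color","blue"; "negcolor","red"; "axis",true; "axiscolor","black"})
```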


First and Last Value Colors


Sparkline Win Loss with First and Last

The firstcolor and lastcolor options are a good chance to use custom coloring. These can be helpful for summaries of a series of data like whether a team was over or under .500.


Handling Empty and Non-Numeric Values


Sparkline Win Loss with Filter

The nan and empty options determine how non-numeric or empty data is treated.  In this case, the first cell is empty, and the fifth cell is text. Both cells are ignored.



Column Charts

Best for:  Comparing a series of values against each other

Dealing with Negative Values


Google Spreadsheet Sparkline Column Chart

Column chart sparklines offer two ways to call out negative values. The first is to set the negcolor value, and the second is to not display them at all by setting the ymin value to 0.
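As a sketch, assuming data with negative values in B2:K2, the two approaches might look like this (cell references and colors are placeholders):

```
=SPARKLINE(B2:K2, {"charttype","column"; "color","grey"; "negcolor","red"})
=SPARKLINE(B2:K2, {"charttype","column"; "ymin",0})
```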


High and Low Value Accent Colors


Google Spreadsheet Column Chart Sparkline

The highcolor and lowcolor options make it easy to identify the high and low values in the data series.


Bonus: Hex Color Options

Google Spreadsheets is great in that it allows you to customize a lot of the visual display. Here are a few nice color palettes to make your data a bit more visually pleasing. For more, check out:

#69D2E7 #A7DBD8 #E0E4CC #F38630 #FA6900

#41F4E3 #73A4D3 #F0D9A8 #70109 #EB4967

#00A0B0 #6A4A3C #CC333F #EB6841 #EDC951

#ECD078 #D95B43 #C02942 #542437 #53777A


I hope that covers any visualization that you might need to use. Let me know in the comments if you are having any trouble. Tomorrow I will publish a post on how to make funnel diagrams using the Sparkline Function.

Website Launch Checklist for Digital Marketing in 2017

Website launches are a magical time, full of promise and wonder. They are a stress-free time where every stakeholder rejoices and relaxes, knowing that the hard work is behind them. Now all they have to do is let the smooth breezes of the digital ocean carry them to the land of Internet business success.

If you have ever been involved in a website launch, you’ll probably recognize that I just lied to you. I’ve seen it from the development agency side, from the digital marketing agency side, and from the business side. And I’ve spent my Friday nights putting out flames caused at launch time. (Pro tip: Don’t ever launch on a Friday. Just don’t.)

Website Launch 2017

More often than not, at launch time there are more questions than answers. Maybe they are the right questions… and maybe they’re not.

The idea behind this checklist is to help you ask the right questions and to ensure that you’ve made the right investments in digital marketing in the short and long term.

The good news is: every item on this list can be done for free! And much of it can be handled well in advance of launch (or even shortly after). It is also prioritized to maximize the juice-to-squeeze ratio. (See below for more on prioritization.)

What this Checklist Covers

This checklist will help you prepare your site with all the on-site and off-site components of a robust digital marketing machine. It takes into account immediate needs of measurement and basic SEO and sets the groundwork for future success with online advertising and advanced analysis.

The checklist focuses on four key areas of digital marketing:

  • Measurement and Analytics – Analytics tools that allow site owners to identify problems and opportunities, and drill into ways to address them. In my not-so-humble opinion, this is the #1 prerequisite to digital marketing success.
  • Search Engine Optimization – Infrastructure that makes a site more search engine friendly, and metadata to optimize appearance on search engine result pages (SERPs).
  • Social Media and Social Commerce – Components that encourage interactivity and sharing, and metadata that optimizes site content for display on social media platforms.
  • Ongoing Engagement and Advertising – On-site data collection and opt-ins that enable detailed customer segmentation and deep audience engagement.

The list contains some items that are fundamental to every website, and there are others that are more or less important depending on your digital strategy and operational capacity.

This list could be much longer if you want to get into things like accessibility, branding, in-depth technical SEO, and user experience (which all do affect digital marketing success) but the intent is to keep the list to 20 items and remain focused on the efforts that will have a more direct impact. That is also why it is prioritized.

Prioritization is based on a balance between the effort of implementation and the impact on digital marketing efforts. This matrix tends to be a very reliable heuristic for decision making and prioritization with digital marketing efforts, and this is no exception.

Alright, let’s get on with it! The list is covered in sections and prioritized at the bottom of the article. Get your box-ticking pens ready!

Measurement and Analytics

Data is fundamental to digital marketing. So measurement comes first because it is the foundation that allows us to experiment, iterate, and improve confidently. Free tools can take you the first 80% of the way with some customization, and to get near-perfect fidelity, paid tools are very cost-effective. These free tools will help you understand the “what,” but also the “why.”

  • Google Tag Manager (GTM)

Priority: 1 Dev Required Effort: Low Value: High

A few years ago, this would have been, “Install Google Analytics,” but these days,
installing Google Tag Manager takes top priority because it diminishes the need for development resources for a large part of this list. It also allows you to be more agile when it comes to tracking beacons/tags/pixels and even structured data for SEO and social plugins. It also reduces the cost of site updates or migrations because it sets up a framework for tracking that can be used well into the future.

  • Google Analytics Page Tracking

Priority: 1 No Dev Required (w/ GTM) Effort: Low Value: High

It’s nearly 2017. I shouldn’t need to tell anybody why you need to have Google Analytics on your site. But in case you have been under a rock for the last ten years, Google Analytics allows you to analyze inbound marketing channel performance, user behavior, and on-site conversion rates. And with GTM, it only takes a few clicks to set up.

  • User Session Recording

Priority: 2 No Dev Required (w/ GTM) Effort: Low Value: High

If Google Analytics answers the “what,” session recording plugins answer the “why.” Even if they seem creepy, they are as close as you can come to anonymous over-the-shoulder user observation. These are great for understanding the pain points in the sign-up or checkout processes and give you great ideas for experience optimization tests. I have a couple of favorites in this space, both of which are free and offer pretty great customer service: Inspectlet is good, and HotJar offers a bit more in its free package.

  • Google Analytics Filters, Goals, and Demographic Reports

Priority: 2 No Dev Required (w/ GTM) Effort: Low Value: Moderate

These three features are not retroactive. That is what makes them so important to set up when a site launches. They are all processed during collection, unlike segments. So set yourself up for measurement and analysis success by setting up common filters like IP exclusions and domain partitions; common goals like product, lead form, or transaction pageviews; and demographics reports to gain a deeper understanding of your site’s users.

  • Google Tag Manager Data Layer & Events

Priority: 3 Some Dev Required (+ GTM) Effort: High Value: High

If demographic reports seem interesting, then you are going to love having a comprehensive data layer on your website. Fill your data layer with all the variables that your CMS or ecommerce platform has about the user, page, or product as the user navigates through your site to enable deeper analysis and smarter remarketing and conversion tags. Google Tag Manager can enable event tracking for common interactions like clicks and form submits without the need for developer help. If you are not using GTM, Autotrack is another good option.
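As a sketch, a page template might seed the data layer like this before the GTM snippet loads. The key names (pageType, productId, userLoggedIn) are hypothetical; use whatever your CMS or ecommerce platform exposes:

```javascript
// Hypothetical data layer push: the keys are illustrative,
// not a required schema.
var dataLayer = dataLayer || [];
dataLayer.push({
  event: "pageview",
  pageType: "product",
  productId: "SKU-1234",
  userLoggedIn: true
});
// GTM variables can now read these values for tags and triggers.
```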

Search Engine Optimization

SEO is all about relevancy and ranking. It is a site owner’s job to provide as many signals as possible about why a site is relevant to a target search term and why it should rank for it. Search engines consider a multitude of signals these days. There are some that coincide with usability and should be baked into site design, like responsive design, HTTPS, pure site speed, internationalization, user sitemaps, and 404 pages. And there are others that serve primarily as signals of relevancy and ranking. These are at the top of that list:

  • Robots.txt file and XML Sitemap

Priority: 1 No Dev Required Effort: Low Value: High

Yes, this is basic, and it goes without saying, but these two files are too often forgotten. The robots.txt file and XML sitemap are like invitations for search engines to index your site. They are the first step in SEO because you have to be indexed before you can rank. There are some technical optimizations that you can make to your sitemap, but fundamentally, any sitemap is better than no sitemap. Most CMSs will generate a sitemap for you, usually at “/sitemap.xml.” Otherwise, you can make one online or with Screaming Frog and upload it to your site.
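For reference, a minimal robots.txt that allows all crawling and advertises the sitemap might look like this (the domain is a placeholder):

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```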

  • Google and Bing Webmaster Tools

Priority: 1 No Dev Required Effort: Low Value: High

Google and Bing Webmaster Tools are the closest most people will come to directly communicating with search engine indexes and algorithms. These tools are invaluable for SEO because they provide insight into how search engines index, evaluate, and rank your site. Set up email forwarding and keep an eye on these because your SEO success depends on it. Also, while you’re at it, don’t forget to configure Search Console data in Google Analytics.

  • Submit Your XML Sitemap to Google and Bing Webmaster Tools

Priority: 1 No Dev Required Effort: Low Value: High

If an XML sitemap is an invitation to search engines to index your site, this is like dropping that invitation at the doorstep of your invitees. Being explicit about where your sitemap is will help ensure all your pages, even deeply linked ones, get a fair chance to be indexed. While you’re in your webmaster tools, take a look at the Index Status report to see how many of your site’s pages are indexed. If you have new pages that need to be indexed quickly, you can also manually add them to the index.

  • Essential Content and Markup Optimization

Priority: 1 Some Dev Required Effort: Med Value: High

Google and Bing are intelligent machines, but they are still only machines. They need as many signals as they can get to understand the relevancy of your site and its pages. Without going into too much detail, at the very least, ensure that your pages have unique and descriptive title tags, URLs, image alt attributes, and h1 tags that tastefully contain the keywords that people use in search. Additionally, strive for at least 100-300 words of unique content on each page. But if you don’t have anything useful to say to the reader, just skip the spam creation. Finally, ensure that your site has a “shallow” architecture, meaning that Googlebot doesn’t have to follow too many links just to find your site’s content, and use descriptive link text to help searchers and bots understand the contents of the site. See: SEO 101.

  • Claim Your Business on Google and Bing

Priority: 2 No Dev Required Effort: Low Value: Moderate

Claiming your business is primarily for local business websites but can be helpful for indexing sites of any kind. Google My Business and Bing Places for Business are important signals for the local search algorithms that brick-and-mortar businesses depend on. These will get your business onto Google Maps and whatever Bing has for maps, giving your site and your business another opportunity to be found. Related to this: if you haven’t already, claim all of your social media profiles for your brand name and point the profile links to your homepage. This will help local and non-physical businesses with search engines’ so-called Knowledge Graphs. More on that below.

  • Structured Data and the “Knowledge Graph”

Priority: 4 No Dev Required Effort: Moderate Value: Moderate

The Knowledge Graph is a term that describes Google’s algorithmic understanding of the Web and all the things represented in it. The key idea for SEO is “things, not strings.” The Knowledge Graph is ever-evolving, and so are search engines’ expectations of the websites that want to integrate into it. The current, and hopefully future, method of signaling what your website and its pages represent is based on schema.org and JSON-LD. Within this embedded JSON script, you can provide signals about everything from who your company’s CEO is to how long a recipe takes to complete. With this information, search engines can provide what are called “rich results,” such as star ratings, image previews, and breadcrumb navigation in search results.
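As a sketch, a minimal schema.org Organization snippet embedded as JSON-LD might look like this (all names and URLs are placeholders):

```
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png",
  "sameAs": [
    "https://twitter.com/example",
    "https://www.facebook.com/example"
  ]
}
</script>
```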

  • Image and Video XML Sitemaps

Priority: Depends Some  Dev Required Effort: Moderate Value: Depends

If your site is very image- or video-driven and features a lot of unique content, then image and video search are likely to be critical to your success. Image sitemaps and video sitemaps are similar to regular XML sitemaps, but instead of providing references to pages, they provide references to images and videos. This helps to ensure that all your media is indexed and presents an opportunity to provide a little more context to search engines about each media file.

Social Media and Social Commerce

This is meant to cover two different spheres of “Social”: Social Media, the interactivity with a brand and the media associated with it; and Social Commerce, the interactivity between humans with the intent to improve customer experience and facilitate commerce. Both “trends” are here to stay and will continue to evolve. So set yourself up right and get social!

  • Fast and Simple Share and Follow Buttons

Priority: 2 No Dev Required (w/ GTM) Effort: Low Value: High

You’ve put a lot of effort into launching a great website and now you are thinking about marketing. There is not a more trusted form of advertising than social sharing. And best of all, it’s free! The only thing you have to do is give people a (good) reason to share and make it easy. At a minimum, you can utilize a plugin like AddThis and drop it into your site with Google Tag Manager. If your site/product is strong on images and can attract a Pinterest audience, don’t forget your Pinterest button. And if you want to provide a unique experience, you can plug right into Twitter’s Web Intents API or Facebook’s Open Graph Stories API.

  • Social Customer Service

Priority: 3 Dev Required Effort: Medium Value: High

The fact is, people are going to mention your brand on social media, so it is important to take steps that put your brand in a position to interact with your customers in the way that is best for your brand and your customers. This used to mean having an email link and, if you were really nice, a phone number to call (it still does). But now, because the pace of online business is increasing and customers’ expectations are changing, customers want to use a platform that suits their device and their communication preferences.

To avoid a Twitter-bomb scenario, offer your customers an easy social medium to interact with you. Two free ways of doing this, depending on your audience, are Facebook Messenger’s platform and Skype. If you want to step it up a notch ($), have a look at Intercom, which is a platform built for just this.

  • Social Meta Tags

Priority: 4 Dev Required Effort: Med Value: Med

If you have invited users to express their opinion about your site by providing sharing widgets, you should also recognize your responsibility to manage your site’s appearance in social shares. Social meta tags are the signals that social media platforms use to display in-depth descriptions and eye-catching imagery for your site’s content rather than just a URL. This makes your social posts appealing and increases the likelihood that new users will come to your site.
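As a sketch, a minimal set of Open Graph and Twitter Card tags in the page head covers the major platforms (all values are placeholders):

```
<meta property="og:title" content="Page Title" />
<meta property="og:description" content="A short, compelling description." />
<meta property="og:image" content="https://www.example.com/share-image.png" />
<meta property="og:url" content="https://www.example.com/page" />
<meta name="twitter:card" content="summary_large_image" />
```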

  • Custom Shortlinks

Priority: 5 No Dev Required Effort: Med Value: Low

Ok, this is more “cute” than anything, but it is a nice and subtle branding mechanism that signals to your audience that you are mobile savvy. Bitly’s custom branded short links allow you to share your site’s content with a nice branded short URL that signals to your customer where the link goes and track engagement at the same time. Like I said, it’s cute, but it’s cool.

Ongoing Engagement and Advertising

As tech evolves to enable higher degrees of interaction, consumers start to expect more timely and personalized messaging and care. To take advantage of these new technologies, there is a foundation of data collection and explicit (and implicit) user opt-ins that need to happen. Here are a few of these ever-evolving technologies that increase audience engagement.

  • Email Capture

Priority: 1 Some Dev Required Effort: Med Value: High

This might seem like bait and switch, but commerce over email is easier, cheaper, more effective, and even more appropriate for some products and audiences. Even if your email strategy is not very sophisticated yet, this form of data acquisition will allow you to send a newsletter, run programmatic remarketing, or solicit consumer feedback surveys.

What’s more interesting is that, since the email address is the ID card of the internet, email lists will enable you to understand your user and customer demographics in a new way. Email lists can be used to create Facebook advertising audiences. Facebook will then provide you with demographic data for these lists. You can even take this a step further and buy more demographic data about your audiences for further segmentation. To get started for free: try this API to determine the gender of your customers (it requires a first name, so collect it too, or use this name-deducer API if you don’t).

  • Remarketing Audiences

Priority: 2 No Dev Required (w/ GTM) Effort: Medium Value: Low

Facebook Remarketing Audiences give you a better understanding of, and better interaction with, your website’s audience. As mentioned above in Email Capture, creating Facebook Remarketing Audiences will give you demographic insights into your audience and can allow you to target users based on the content they have viewed or the actions they have taken on your website. This type of remarketing can also be achieved through Google’s display network. The good news is that all you have to do is set up your Google Analytics correctly, and you are all set.

  • Push Messaging

Priority: 4 No Dev Required (w/ GTM) Effort: Low Value: Low

I honestly can’t believe this isn’t taking off faster. Push messaging offers an engagement method alternative to email that has both lower-friction sign-ups and a less noisy messaging space. Additionally, this medium is inherently more immediate than email or social because it does not depend on the user reading email or being on a social media platform. They just have to be on their device. Additionally, this can be tied to your RSS feed to send notifications for news or blog articles programmatically. But remember, with great power comes great responsibility. Don’t be spammy. For a free recommendation, try OneSignal. It’s not a boy band, but it is a very comprehensive and completely free service.

  • RSS Feeds

Priority: 4 | Dev Required | Effort: Medium | Value: Low

RSS feeds… they used to be a lot cooler. These days, email, push, and social media have reduced the need to keep up with your favorite blog via an RSS feed, but there is still some value here. If nothing else, feeds are essential for triggering email alerts, push notifications, and/or social media alerts to your audience.* RSS feeds also enable your loyal, tech-inclined fans to use services like IFTTT to help spread your content.

*It goes without saying: be sensible with automation.
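The “feed triggers alerts” idea above boils down to polling the feed and handing the newest item to your email or push tooling. A minimal sketch, using only the standard library and an inline sample feed (the feed URL and polling schedule are up to you):

```python
import xml.etree.ElementTree as ET

# Inline stand-in for your real feed; in practice you would fetch it over HTTP.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>My Blog</title>
  <item><title>Newest post</title><link>https://example.com/newest</link></item>
  <item><title>Older post</title><link>https://example.com/older</link></item>
</channel></rss>"""

def latest_item(feed_xml: str):
    """Return (title, link) of the first <item>; RSS lists newest first."""
    channel = ET.fromstring(feed_xml).find("channel")
    item = channel.find("item")
    return item.findtext("title"), item.findtext("link")

title, link = latest_item(SAMPLE_FEED)
print(f"New on the blog: {title} -> {link}")
```

In a real pipeline you would remember the last item you announced (e.g. by its `<guid>` or link) so a repeat poll doesn’t re-send the same alert, which is exactly the sensible-automation point in the footnote.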

End Note

You might be thinking that this is a mountain of work on top of what it already takes just to maintain a website. The key, as in any strategy, is to separate the impactful efforts from the speculative ones. That depends largely on your site, your audience, and your business model. Consider what goal you are trying to achieve and ask yourself whether each of these efforts will help you achieve it. When it comes to measurement and analytics, though, the answer is always “yes.”

Add a Website Skype Button to Talk With Leads and Customers

This post demonstrates several ways to add click-to-call and click-to-chat Skype button functionality to your website. Beyond being a modern use of the web browser, it enables a modern approach to interacting with leads and customers. There is good reason to do this. Let me tell you why…

[Image: example website Skype button]

A New Definition of “Social” Brands

The meaning of “social” for business and e-commerce is changing. A couple of trends are having a big influence on this shift.

The first trend is that with so many social and messaging platforms available, we get to pick and choose the level of privacy and involvement we associate with each one. For example, I use Facebook and WhatsApp for close friends, and LinkedIn and Skype for business. (No surprise that the latter two are both owned by Microsoft now.) These expectations keep evolving with the birth of every new platform and the generations native to each.

The second trend is social commerce: the consumer expectation that a relationship with a brand will be not only tangible but also responsive. There is a higher expectation of interactivity. One example is how Facebook Pages now report response times, putting a brand’s level of interactivity on display for all to see.

So what does it mean? Social no longer means just sharing. Social means interacting.

This post offers an easy way to become more interactive with your customers.
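As a taste of one of those ways: the simplest Skype button is just an anchor using the `skype:` URI scheme, which supports `?chat` and `?call` actions (the helper below and the `echo123` test account in the usage line are illustrative; the exact markup the post walks through may differ):

```python
def skype_button_html(skype_name: str, action: str = "chat", label=None) -> str:
    """Generate an anchor tag using the skype: URI scheme (?chat or ?call)."""
    if action not in ("chat", "call"):
        raise ValueError("action must be 'chat' or 'call'")
    label = label or f"Skype {action}"
    return f'<a href="skype:{skype_name}?{action}">{label}</a>'

# echo123 is Skype's public echo/test account.
print(skype_button_html("echo123", "call", "Call us on Skype"))
# <a href="skype:echo123?call">Call us on Skype</a>
```

Links like this only work for visitors who have a Skype client installed and registered for the URI scheme, which is why the post also covers richer button options.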
