Archive

January 3, 2019

I’ve been debugging a few Dynamic Remarketing issues for clients lately. Here are a few thoughts on what I’ve learned.

NB: This refers to Google Analytics Dynamic Remarketing, where audiences are set up and defined in Google Analytics. It is not the same as AdWords Dynamic Remarketing, even though the purpose is very similar. I’ve only worked with retail clients in this area, so this post skews towards their concerns and problems.

When you tag your site for Dynamic Remarketing you are essentially providing hints for Google’s machine learning system, which then targets adverts based on what it thinks the user wants and what it thinks you’ll pay the most for.

The most common error I see is people using a different product ID for Dynamic Remarketing than the one in the Merchant Centre feed. The product ID you send in the `ecom_prodid` dimension must match one of the following Merchant Centre fields: `id`, `item_group_id` or `c:drid`.

If the only product IDs you can send to Google Analytics do not match the Merchant Centre `id` or `item_group_id` (e.g. you are using tag manager to pull them out of the DOM and the page doesn’t contain the right fields) then add the IDs to the
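As an illustration of the tagging side, here is a minimal sketch (not from the original post) of how a product page might push the ID into the dataLayer in a tag-manager setup, with a variable then feeding the custom dimension mapped to `ecom_prodid`. The event and field names are assumptions; the only hard requirement is that the value matches the Merchant Centre field exactly.

```typescript
// Hypothetical product-page tagging; the event and field names below are
// illustrative assumptions, not code from the post.
declare global {
  interface Window {
    dataLayer: Record<string, unknown>[];
  }
}

window.dataLayer = window.dataLayer || [];

// The ID pushed here must be character-for-character identical to the
// Merchant Centre feed's id or item_group_id (or a c:drid custom field),
// otherwise Google cannot join the remarketing hit to a product.
window.dataLayer.push({
  event: "productDetailView", // assumed trigger name in tag manager
  prodid: "SKU-12345",        // assumed key, mapped through to ecom_prodid
});

export {};
```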

January 2, 2019

This site uses the static site generator Hakyll. Hakyll is a bit of an unusual choice here (note to self: future blog post), which means that static site generator services like Netlify won’t run the generation process for me. They can host the pages, but I’d have to run the script which generates the HTML myself. This is fine on my own computer, but I’d like to be able to add posts from places or machines where I don’t have Hakyll installed.

I use CircleCI to build the site and then Amazon S3 to host the pages. CircleCI will automatically run whenever a commit is made to the repository for this site and then upload the results to Amazon, where people like you can see them.

Here is the CircleCI build script (see GitHub for the latest version):

```yaml
jobs:
  build:
    docker:
      - image: fpco/stack-build
    steps:
      - run: apt-get update && apt-get -qq -y install awscli
      - checkout
      - restore_cache:
          keys:
            - stack
      - run: stack setup
      - run: stack install
      - save_cache:
          key: stack
          paths:
            - .stack-work
            - ~/.stack
      - restore_cache:
          keys:
            - site
      - run: ~/.local/bin/eanalytica-www build
      - save_cache:
          key: site
          paths:
            - _site
      - run: aws s3 sync _site s3://com.eanalytica.www2/
```

November 8, 2018

When you install Tensorflow in the normal way you will often get a warning message like “The TensorFlow library wasn’t compiled to use SSE4.2 instructions, but these are available on your machine and could speed up CPU computations”, or something like that. This tells you that the Tensorflow computations could be optimised further, but these optimisations aren’t used by default because they are only available on some CPU architectures.

Tensorflow has a guide on how to install from source with these features enabled, but the process is tedious and it is not the Nix way. If you look at the Nix derivation for tensorflow you can see options for enabling SSE4.2, AVX2 and FMA support; the question is “how do I enable these in my own declaration?”

Here is an extract from the old version of my nix file:

```nix
  pandas
  scikitlearn
  Keras
  tensorflow
  numpy
  h5py
  scipy
  jupyter
  conda
  statsmodels
  matplotlib
  seaborn
  nltk
  beautifulsoup4
  flask
  psycopg2
];

python3packages = pkgs.python3.withPackages pythonPackageList;
```

All we need to do is change the tensorflow variable to be what we want it to be:

```nix
{
  overrides = self: super: {
    tensorflow = super.tensorflow.override {
      sse42Support = true;
      avx2Support = true;
      fmaSupport = true;
    };
  };
}
```
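For context, here is a sketch of how the two extracts can fit together in a single expression. This is an assumption about the file’s overall shape, not the post’s actual code: current nixpkgs exposes the interpreter’s override hook as `packageOverrides`, which is used here in place of the extract’s `overrides`.

```nix
# A sketch under an assumed file structure; only the override flags
# come from the post itself.
let
  pkgs = import <nixpkgs> { };

  # Rebuild TensorFlow with the CPU features the warning mentions.
  python3 = pkgs.python3.override {
    packageOverrides = self: super: {
      tensorflow = super.tensorflow.override {
        sse42Support = true;
        avx2Support = true;
        fmaSupport = true;
      };
    };
  };

  pythonPackageList = p: with p; [
    tensorflow
    numpy
    pandas
  ];
in
python3.withPackages pythonPackageList
```

Note that changing these flags changes the derivation, so Nix will compile TensorFlow from source rather than fetching it from the binary cache; expect the first build to take a while.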

February 29, 2016

Last week I enjoyed reading Andrew Goodman’s SEL piece on The Death of Search Marketing Expertise. For those of you who haven’t read it, Andrew laments the lack of knowledge and expertise among modern agency owners. His hypothesis is that this is a result of agency owners caring more about the growth of their business than about the “craft” (my word) of marketing, which I think is probably true. This leads to a worse level of service for the agency’s clients: the owner sets the culture, so the employees of the agency don’t get the mentorship and development they need to grow in their roles.

On Twitter, my “in group” are PPCers at the “craftsman” end of the spectrum (a minority of the industry as a whole). Andrew’s piece is like catnip for us, because the “clueless idiot who makes a tonne more money than me” is always a good out group to position against. But no matter how much we dislike it, I think the clueless agency owner is here to stay: assume there is no relationship between PPC ability and the ability to run an agency. If you plotted these two axes on a chart (too

February 1, 2016

Kirk Williams recently published a post on PPC Hero about PPC account setup for ecommerce clients. It is a good post and you should read it; the main takeaway for me is a reminder of how many things it is possible to do in an ecommerce account. It also made me think a bit, because my approach to a new account build is different; this post explores which of us, Kirk or me, is wrong. Spoiler: this is an “it depends” situation.

Firstly, what is my approach? As with most things PPC, I don’t have a process but instead a set of principles which can be applied to generate something that looks a bit like a process from the outside. This is, in some ways, a major weakness of my practice and would be something that needed fixing urgently if I were to try to scale my one-man-band consultancy into an agency.

1 Account Build Principles

1.1 An account is never finished

There is no point at which a PPC account is “done”. There are always more keywords that could be added, more ad groups that could be segmented, more ad extensions that need to be tested.