Running an Android emulator on Windows

Today I wanted to run an Android emulator and use the Charles web debugging proxy to inspect the headers being sent over the wire. Here’s how to get an Android emulator running on your Windows machine.

Useful links

Edit: There are more useful things out there now that Microsoft and Android play together a bit more nicely. Check out the Xamarin suite – especially the Xamarin Live Player. (Cheers Chris!)

Match Fiddler AutoResponder on body content

It’s possible to get Fiddler to manipulate your responses before they are returned to your application when Fiddler is acting as a proxy to your application.

Interestingly, I couldn’t get the Chrome Advanced REST Client to go through Fiddler, but Postman for Chrome worked without any issues.

After you’ve set up your app to use your local Fiddler port, open up the AutoResponder tab.

  • Tick the Enable rules checkbox to turn on the AutoResponder feature
  • Tick Unmatched requests passthrough so that requests that don’t match any rule continue on to their destination
  • Items at the top of the list will get matched first
  • The first item that is matched will return the configured response
  • You can drag items from the traffic history into the responses dropdown, and edit the response. Really cool!

Here are some useful matches:

  • URLWithBody:REGEX:example.com REGEX:example
    This lets you match the url and body with regular expressions
  • Header:MyHeader=exactmatch
    Match a particular header

Struggling to get a match? It might be an encoding issue, or you might want to see more rule types (tip: click the dropdowns in the Rules Editor to see some built-in examples).
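To make the URLWithBody rule above concrete, here’s a rough C# sketch of how that kind of matcher behaves – both the URL and the body must match their regular expressions. This is an illustration of the matching idea, not Fiddler’s actual implementation:

```csharp
using System;
using System.Text.RegularExpressions;

class AutoResponderMatchSketch
{
    // Both patterns must match for the rule to fire.
    static bool UrlWithBodyMatch(string urlPattern, string bodyPattern, string url, string body)
        => Regex.IsMatch(url, urlPattern) && Regex.IsMatch(body, bodyPattern);

    static void Main()
    {
        // Mirrors the rule: URLWithBody:REGEX:example.com REGEX:example
        Console.WriteLine(UrlWithBodyMatch(
            "example\\.com", "example",
            "http://example.com/api", "{ \"name\": \"example\" }"));
    }
}
```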


Force .NET application proxy

Forcing a .NET application proxy via web.config or app.config

This lets you force a proxy for a .NET application, including websites running under any user or service account. It’s very useful for proxying via Fiddler.

<system.net>
   <defaultProxy>
     <proxy autoDetect="false" bypassonlocal="false" proxyaddress="http://127.0.0.1:8888" usesystemdefault="false" />
   </defaultProxy>
</system.net>
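If you can’t touch the config files, you can get roughly the same effect in code. Here’s a sketch for a single HttpClient (rather than the machine-wide default), again assuming Fiddler is listening on its default port 8888:

```csharp
using System;
using System.Net;
using System.Net.Http;

class ForceProxySketch
{
    static void Main()
    {
        // Route this client's traffic through the local Fiddler proxy.
        var proxy = new WebProxy("http://127.0.0.1:8888");
        var handler = new HttpClientHandler { Proxy = proxy, UseProxy = true };

        using (var client = new HttpClient(handler))
        {
            // Any requests made with this client now go via the proxy.
            Console.WriteLine("Proxying via " + proxy.Address);
        }
    }
}
```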

Mocking and testing live responses

Instead of proxying live responses like this, I highly recommend using a mockable proxy such as MockServer, or running tests entirely in memory with an OWIN-based mocking server like JustFakeIt.

Deserializing different types using Newtonsoft JSON.NET

I want to be able to serialize and deserialize different types in my datastore JSON. Here’s how you can do it in .NET using a custom SerializationBinder – Json.NET uses the standard .NET SerializationBinder to resolve custom types.

This relies upon a $type field being added to objects in your JSON, but this is probably going to be OK since we don’t need types for arrays or simple types (like string or int).

Let’s start off with a test to explain what I’m doing:

[Test]
public void SimpleStepAreDeserializedIntoCorrectTypes()
{
    var knownTypesBinder = new KnownTypesAssemblyBinder();
    knownTypesBinder.AddAssembly(Assembly.GetAssembly(typeof(KnownTypesAssemblyBinder)));

    var json = JsonConvert.SerializeObject(_template, Formatting.Indented, new JsonSerializerSettings
    {
        TypeNameHandling = TypeNameHandling.Objects,
        Binder = knownTypesBinder
    });

    var deserialized = (FlowTemplate)JsonConvert.DeserializeObject(json, new JsonSerializerSettings
    {
        TypeNameHandling = TypeNameHandling.Objects,
        Binder = knownTypesBinder
    });

    Assert.IsInstanceOf<FlowTemplate>(deserialized);
    Assert.IsInstanceOf<FormCollectionStepTemplate>(deserialized.Steps[0]);
    Assert.IsInstanceOf<RegexValidationRule>(deserialized.Steps[0].PostValidationRules[0]);
    Assert.IsInstanceOf<StoreNewItemStep>(deserialized.Steps[1]);
    Assert.IsInstanceOf<StoreWhatWhere>(((StoreNewItemStep)deserialized.Steps[1]).WhatToDo[0]);
}

Here’s the implementation. Inspiration taken from the Type Converting thread (http://json.codeplex.com/discussions/56031) and the Custom Serialization Binder page (http://www.newtonsoft.com/json/help/html/SerializeSerializationBinder.htm), which this is based on.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;
using System.Runtime.Serialization;

namespace Flow.Library
{
    public class KnownTypesAssemblyBinder: SerializationBinder
    {
        private readonly IList<Type> _knownTypes = new List<Type>();

        public void AddAssembly(Assembly assembly)
        {
            foreach (var type in assembly.GetTypes())
            {
                _knownTypes.Add(type);
            }
        }

        public override Type BindToType(string assemblyName, string typeName)
        {
            var result = _knownTypes.SingleOrDefault(t => t.Name == typeName);
            return result;
        }

        public override void BindToName(Type serializedType, out string assemblyName, out string typeName)
        {
            assemblyName = null;
            typeName = serializedType.Name;
        }
    }
}

This absolutely passes with no issues. Note the $type attribute in the example output below:

{
  "$type": "FlowTemplate",
  "Id": null,
  "Name": "Example Flow Template",
  "Version": 0,
  "Modified": "0001-01-01T00:00:00",
  "Author": null,
  "Tags": null,
  "Steps": [
    {
      "$type": "FormCollectionStepTemplate",
      "Form": "Step 1 Form",
      "Id": "Step 1 Id",
      "Version": 111,
      "Name": "Step 1 Name",
      "Author": "step1@example.com",
      "Tags": null,
      "PreValidationRules": [],
      "PostValidationRules": [
        {
          "$type": "RegexValidationRule",
          "IsValid": false
        }
      ]
    },
    {
      "$type": "StoreNewItemStep",
      "WhatToDo": [
        {
          "$type": "StoreWhatWhere",
          "SourceKey": "name",
          "Collection": "customer",
          "Key": "name"
        },
        {
          "$type": "StoreWhatWhere",
          "SourceKey": "email",
          "Collection": "customer",
          "Key": "email"
        }
      ],
      "Id": "Step 2 Id",
      "Version": 222,
      "Name": "Store Customer Information",
      "Author": "step2@example.com",
      "Tags": null,
      "PreValidationRules": [],
      "PostValidationRules": []
    }
  ],
  "Links": [],
  "Groups": []
}

I’m Leaving JustEat (true story)

Today I sent my leaving letter around work. It’s a shame I couldn’t send it to the whole department, but it feels nicer to hand-pick the people I wanted to send it to.

Dear Friends and Colleagues,

After two very happy and successful years at Just Eat I have decided to hand in my notice and try something new.

It’s been fantastic working with you and this has been the best company I’ve ever worked for. I’m so very thankful to have met you and to have delivered real change together on the platform.

My leaving drinks are on Friday in the Wine Cellar, a hidden gem underneath Holborn Viaduct. I hope you’ll be able to come and say goodbye!

Many thanks,
Alex

Configuring SSH keys for Github using Windows 10

You’ll need to install:

Create an ssh key and set it as your default:

  • Run ssh-keygen
  • Create a key. Name it id_rsa. You’ll get a private key (id_rsa) and a public key (id_rsa.pub). Never disclose your private key. Share your .pub key instead.
  • Upload the id_rsa.pub file to your github.com SSH keys
  • Save the private id_rsa file in your ~\.ssh folder
  • Restart your terminal if it doesn’t work

Fixing PowerShell Params “The input to an assignment operator must be an object that is able to accept assignments”

Problem: The input to an assignment operator must be an object that is able to accept assignments

Write-Host "Ahoy hoy!"
param(
    [Parameter(Mandatory=$true)][string]$path,
    [Parameter(Mandatory=$true)][string]$environment="example" # error here
)

Solution: You’ve got a problem with your params. In my case I had a statement before the param declaration. The param block must be the first statement in the file, or the first statement in a function.

param(
    [Parameter(Mandatory=$true)][string]$path,
    [Parameter(Mandatory=$true)][string]$environment="example"
)
Write-Host "Ahoy hoy!"

More information:

Wrapping a using statement with Dispose in PowerShell

Today I’m working with zip archives for Lambda functions. I needed to take the zip file and add deployment configuration. Lambdas are currently packaged up with their configuration, so I need to inject a file into the zip.

It’s relatively trivial to work with compressed zip files in PowerShell, but you need to dispose of the handle when you’ve finished with it.

[System.Reflection.Assembly]::LoadWithPartialName("System.IO.Compression.FileSystem")
$disposeMe = [System.IO.Compression.ZipFile]::Open($path, "Update");
# ...
$disposeMe.Dispose();

A using statement is really just a try/finally that calls Dispose if the object isn’t null. Dave Wyatt’s Using-Object: PowerShell version of C#’s “using” statement post has a good example function where this is all taken care of, using a $ScriptBlock to execute the body.
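For comparison, this is roughly what the C# compiler turns a using statement into – the same try/finally-with-Dispose shape that a PowerShell Using-Object function reproduces. FakeResource is just a stand-in so the sketch is self-contained:

```csharp
using System;

// A stand-in disposable so we can observe Dispose being called.
class FakeResource : IDisposable
{
    public bool Disposed;
    public void Dispose() => Disposed = true;
}

class UsingExpansionSketch
{
    static void Main()
    {
        // using (var resource = new FakeResource()) { ... }
        // expands to approximately:
        var resource = new FakeResource();
        try
        {
            // ... work with the resource ...
        }
        finally
        {
            if (resource != null) resource.Dispose();
        }

        Console.WriteLine(resource.Disposed);
    }
}
```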

So instead of manually calling dispose, we can now write a using lookalike, and the handle will be disposed for us. Lovely!

Using-Object ($zip = [System.IO.Compression.ZipFile]::Open($path, "Update")) {
    # ... 
}

If you don’t want to add the function, you can always just wrap a try/finally around it.

try { $disposeMe = [System.IO.Compression.ZipFile]::Open($path, "Update"); }
finally
{
  if ($null -ne $disposeMe -and $disposeMe -is [System.IDisposable]) { $disposeMe.Dispose() }
}

Responses for batch results

What’s the best response shape for a batch request?

Depending on your design, individual items in a batch request can fail while others succeed, so the endpoint needs to report success and failure per item.

Salesforce returns the item and a success field in each response item:

[
  {
    "success" : true,
    "created" : true,
    "id" : "1",
    "errors" : []
  },
  {
    "success" : false,
    "created" : false,
    "id" : "2",
    "errors" : ['example error message']
  }
]

The AWS DynamoDB response returns a set of metadata, successes and failures:

{
  "Metadata" : {
    "ConsumedCapacity" : 100,
    "OtherMetric" : true
  },
  "Items" : [
    { "Key" : "Item1" },
    { "Key" : "Item2" }
  ],
  "Failed" : [
    { "Key" : "Item3", "Reason" : "Value cannot be null" }
  ]
}

I prefer a third option, which takes the best of both. Return a list of the ids you’ve received, so requests can be tracked. Then return an array of items that were successful, and another of items that failed, each with an array of reasons for the failure.

{
  "ids" : [ "item1", "item2", "item3" ],
  "success" : [ { "id" : "item1" }, { "id" : "item2" } ],
  "failed" : [ { "id" : "item3", "reasons" : [ "String required" ] } ]
}
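As a sketch, that preferred shape maps naturally onto a few small C# types. The names here (BatchResponse, SuccessItem, FailedItem) are illustrative, not from any SDK:

```csharp
using System;
using System.Collections.Generic;

// One entry per item that succeeded.
class SuccessItem { public string Id; }

// One entry per item that failed, with the reasons it failed.
class FailedItem { public string Id; public List<string> Reasons = new List<string>(); }

// The overall batch response: all received ids, plus per-item outcomes.
class BatchResponse
{
    public List<string> Ids = new List<string>();
    public List<SuccessItem> Success = new List<SuccessItem>();
    public List<FailedItem> Failed = new List<FailedItem>();
}

class BatchResponseSketch
{
    static void Main()
    {
        var response = new BatchResponse
        {
            Ids = { "item1", "item2", "item3" },
            Success = { new SuccessItem { Id = "item1" }, new SuccessItem { Id = "item2" } },
            Failed = { new FailedItem { Id = "item3", Reasons = { "String required" } } }
        };

        Console.WriteLine(response.Failed[0].Id + ": " + response.Failed[0].Reasons[0]);
    }
}
```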

References:

Mounting and diagnosing an EC2 volume from a rescue instance

Here’s a quick way to troubleshoot logs on a failing EC2 with an attached volume.

  1. Remove the instance from any load balancers so it stops taking traffic while you investigate
  2. Take a snapshot of the volume you’re having trouble with
  3. Create a rescue instance in the same AWS account and, using the EC2 Add Storage step, attach a new volume created from your snapshot, mapped to device xvd*
  4. Start up your instance and connect remotely
  5. Use diskmgmt.msc to bring the disk online and mount it
  6. Use file explorer to find your device and explore as a regular hard drive

Having trouble mapping your device? See Mapping Disks to Volumes on your EC2 instance for the documentation on the device names.