Category: Code
By Stuart Mathews
25 March 2018
Last Updated: 01 April 2018

XML to Pandas Data frame

I was recently given a 256MB XML file at work to parse and extract details from. The idea is to use it to prepare data that we'd like to send to our backend API.

The XML format looks like this:

<DataKind1>
    <thing1>hello</thing1>
    <thing2>world</thing2>
</DataKind1>
<DataKind2>
    <other1>hello</other1>
    <other2>hello</other2>
</DataKind2>
<DataKind3>
    <other3>hello</other3>
    <other4>hello</other4>
</DataKind3>

So you can see that it's an XML file that contains different types/kinds of data, each identified by its top-level node name.

Basically, these kinds of data repeat themselves throughout the 256MB file, and I want to use my data analysis skills to slice and chop up the specific data I want (specifically, grouping some of it) before sending it up.

So what I wanted to do was get each kind of data into its own data frame. This meant, of course, parsing the XML into a dictionary per record and then building a new data frame from those records, with one column for each field in the data kind. I'm probably not explaining it very well, so maybe this code is more insightful:

import pandas as pd
import time
from lxml import etree

start = time.time()

holding_file = "HoldingsSummary.xml"
program_name = "AdjustHolding.py"

def print_el(el):
    print(etree.tostring(el, pretty_print=True))

def el_to_dict(el):
    # Flatten a top-level element into {child tag: child text}
    return {child.tag: child.text for child in el}

all_top_level_elements = etree.parse(holding_file).getroot()

dataKinds = {}   # rows collected for each kind, keyed by top-level tag
headersFor = {}  # column names for each kind, taken from the first element seen

for element in all_top_level_elements:
    el_dict = el_to_dict(element)
    if element.tag not in dataKinds:
        dataKinds[element.tag] = []
        headersFor[element.tag] = list(el_dict.keys())
    dataKinds[element.tag].append(tuple(el_dict.values()))

dfs = {}
for name, rows in dataKinds.items():
    if rows:
        dfs[name] = pd.DataFrame(rows, columns=headersFor[name])

# Each kind of data can now be dealt with by fetching the data frame created for it.
dk1_df = dfs['DataKind1']
print(dk1_df.head())

I guess the key thing for me is that I can now work with pandas data frames instead of raw XML for each kind of data, which makes higher-level data analysis techniques available to me, such as setting the series datatypes, running aggregations and grouping.
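
As a quick illustration of that, here's a toy sketch (the column names and values are made up, not from the real file):

import pandas as pd

# A stand-in for one of the data kinds above; XML text means everything starts as strings.
df = pd.DataFrame({
    "account": ["A", "A", "B"],
    "quantity": ["10", "20", "5"],
})

# Set proper dtypes, then aggregate and group.
df["quantity"] = df["quantity"].astype(int)
totals = df.groupby("account")["quantity"].sum()
print(totals)  # account A -> 30, B -> 5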

There are limitations, of course. I don't need to parse multi-level XML nodes; however, if I did, I'd redefine el_to_dict(element) to do more work than just expecting a single level of children under the top-level node, walking the rest of the child tree and effectively flattening it like I do above.
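
If I did need it, a rough sketch of that recursive flattening might look like this (joining nested tag names with a dot is just my own convention here, not something the file requires):

from lxml import etree

def el_to_flat_dict(el, prefix=""):
    # Recursively flatten an element's subtree into {"outer.inner": text} pairs.
    flat = {}
    for child in el:
        key = f"{prefix}{child.tag}"
        if len(child):  # has children of its own: recurse deeper
            flat.update(el_to_flat_dict(child, prefix=f"{key}."))
        else:           # leaf node: record its text
            flat[key] = child.text
    return flat

root = etree.fromstring("<DataKind1><thing1><a>1</a><b>2</b></thing1></DataKind1>")
print(el_to_flat_dict(root))  # {'thing1.a': '1', 'thing1.b': '2'}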

Well, the next piece of work starts on Monday, when I'll group by a specific field in the data kind and prepare some API requests for sending that data up to the cloud.
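
Roughly what I have in mind for that, as a hedged sketch with made-up field names and a placeholder endpoint rather than our actual API:

import pandas as pd
import requests

# Toy grouped data standing in for one data kind (illustrative fields).
df = pd.DataFrame({"account": ["A", "A", "B"], "quantity": [10, 20, 5]})
grouped = df.groupby("account", as_index=False)["quantity"].sum()

# One request per group; to_dict(orient="records") gives JSON-friendly dicts.
for record in grouped.to_dict(orient="records"):
    response = requests.post("https://example.com/api/holdings", json=record)
    response.raise_for_status()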

  • Python
Category: Code
By Stuart Mathews
11 March 2018
Last Updated: 11 March 2018

AWS Elastic Container Service's Task definitions

I've had a pretty busy couple of weeks recently and haven't had much time for some things that I'd wanted to continue with. However, this weekend I had some time to think a little bit more about the investment service that I've been writing.

Previously, I'd successfully deployed my .NET Core 2 Web API into Amazon ECS using Travis-CI. It automatically builds the Docker container and then creates the AWS service, a task definition and finally the ECS cluster. What I still need to do is create a load balancer to manage this.

What I wanted to do is rationalise some of the ideas/concepts that I'd previously taken for granted in the pursuit of just-getting-it-done. It also gave me a bit of time to understand more about ECS's infrastructure and how it works, as I'll be doing more work in this area in the coming weeks.

My initial thought was to describe what a task definition actually is, as this concept is a bit murky, and to do this I need to show you how it fits into the broader ecosystem.

Here is a diagram I drew to explain many of the fundamental concepts of ECS:

Basically, when managing an application that is deployed as a Docker container, you need to understand how your container is organised by ECS.

Fundamentally, managing a Docker image starts with creating a 'Service', which is responsible for managing the infrastructure for your Docker image. Each service is then associated with your Docker image. To make this association, you define a Task Definition, which specifies which Docker image to use and which repository it's hosted in, so ECS knows where to fetch it from. Then, and this is the important part, your task definition can be realised or instantiated (as a Task) onto running EC2 container host instances, thereby effectively running your image on EC2 hosts (the EC2 instance in the diagram).

The part I've left off is that you also define an ECS cluster, which is just the set of EC2 container hosts available for tasks to run on. These are ECS-optimised AMIs provided by Amazon.

The service will automatically run the tasks on available hosts in the cluster. Normally, when you initially create the service, you specify how many tasks must always be running at the same time, and the service ensures that that many tasks are created across the EC2 instances associated with that service (through the association the service has with a cluster).
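
To make those relationships concrete, here is a minimal sketch using Python's boto3 (my actual pipeline drives the aws CLI from Travis-CI, shown further down; the account number, memory and counts here are illustrative):

import boto3

ecs = boto3.client("ecs", region_name="eu-west-2")

# Task definition: which image to run and how (values are illustrative).
ecs.register_task_definition(
    family="coreinvestmenttracker-task",
    containerDefinitions=[{
        "name": "coreinvestmenttracker",
        "image": "123456789012.dkr.ecr.eu-west-2.amazonaws.com/coreinvestmenttracker:latest",
        "memory": 256,
        "portMappings": [{"containerPort": 5000}],
    }],
)

# Service: keep N tasks (instances of that task definition) running on the cluster's hosts.
ecs.create_service(
    cluster="default",
    serviceName="coreinvestmenttracker-service",
    taskDefinition="coreinvestmenttracker-task",
    desiredCount=1,
)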

What I still need to do is deploy my Angular front end app in the same way. This is what I’m going to be doing in the days that follow.

To give you a bit of an idea of how this is programmatically achieved in Travis-CI, this is what sets up the AWS stuff prior to deploying to ECS:

Firstly, this is my Dockerfile:

#Image(build) that is used to compile/publish ASP.NET Core applications inside the container. 
FROM microsoft/aspnetcore-build:2.0 AS build-env
WORKDIR /app

#Copy BUILD_DIR\*csproj and restore as distinct layers
COPY *.csproj ./
RUN dotnet restore

# Copy everything else and build
COPY . ./
RUN dotnet publish -c Release -o out

# Build runtime image by adding the compiled output above to a runtime image(aspnetcore)

FROM microsoft/aspnetcore:2.0
WORKDIR /app
COPY --from=build-env /app/out .

# Expose port 5000 on container to the world outside (container host)
EXPOSE 5000/tcp

# Ask Kestrel to listen on port 5000
ENV ASPNETCORE_URLS http://*:5000
ENTRYPOINT ["dotnet", "CoreInvestmentTracker.dll"]

And this is how it gets deployed by Travis-CI:

First, set up some environment variables:

#!/bin/bash

# set environment variables used in deploy.sh and AWS task-definition.json:
export IMAGE_NAME=coreinvestmenttracker
export IMAGE_VERSION=latest

export AWS_DEFAULT_REGION=eu-west-2
export AWS_ECS_CLUSTER_NAME=default


# set any sensitive information in travis-ci encrypted project settings:
# required: AWS_ACCOUNT_NUMBER, AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY
# optional: SERVICESTACK_LICENSE

Then, build and tag the Docker image:

#!/bin/bash
source ../deploy-envs.sh

#AWS_ACCOUNT_NUMBER={} set in private variable
export AWS_ECS_REPO_DOMAIN=$AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com

# Build process
docker build -t $IMAGE_NAME ../
docker tag $IMAGE_NAME $AWS_ECS_REPO_DOMAIN/$IMAGE_NAME:$IMAGE_VERSION

And finally, set up the AWS ECS infrastructure:

#!/bin/bash
source ../deploy-envs.sh

export AWS_ECS_REPO_DOMAIN=$AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com
export ECS_SERVICE=$IMAGE_NAME-service
export ECS_TASK=$IMAGE_NAME-task

# install dependencies
sudo apt-get install jq -y # install jq for json parsing
sudo apt-get install gettext -y
pip install --user awscli # install aws cli w/o sudo
export PATH=$PATH:$HOME/.local/bin # put aws in the path

# replace environment variables in task-definition
envsubst < task-definition.json > new-task-definition.json

eval $(aws ecr get-login --region $AWS_DEFAULT_REGION --no-include-email | sed 's|https://||') # needs AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY envvars

## Check to see if the repository already exists, otherwise create it
if [ $(aws ecr describe-repositories | jq --arg x $IMAGE_NAME '[.repositories[] | .repositoryName == $x] | any') == "true" ]; then
    echo "Found ECS Repository $IMAGE_NAME"
else
    echo "ECS Repository doesn't exist, Creating $IMAGE_NAME ..."
    aws ecr create-repository --repository-name $IMAGE_NAME
fi

# Push the image to the repository
docker push $AWS_ECS_REPO_DOMAIN/$IMAGE_NAME:$IMAGE_VERSION

# Create a new task revision
aws ecs register-task-definition --cli-input-json file://new-task-definition.json --region $AWS_DEFAULT_REGION > /dev/null
# Get latest revision
TASK_REVISION=$(aws ecs describe-task-definition --task-definition $ECS_TASK --region $AWS_DEFAULT_REGION | jq '.taskDefinition.revision')
SERVICE_ARN="arn:aws:ecs:$AWS_DEFAULT_REGION:$AWS_ACCOUNT_NUMBER:service/$ECS_SERVICE"
ECS_SERVICE_EXISTS=$(aws ecs list-services --region $AWS_DEFAULT_REGION --cluster $AWS_ECS_CLUSTER_NAME | jq '.serviceArns' | jq 'contains(["'"$SERVICE_ARN"'"])')
if [ "$ECS_SERVICE_EXISTS" == "true" ]; then
    echo "ECS Service already exists, Updating $ECS_SERVICE ..."
    aws ecs update-service --cluster $AWS_ECS_CLUSTER_NAME --service $ECS_SERVICE --task-definition "$ECS_TASK:$TASK_REVISION" --desired-count 1 --region $AWS_DEFAULT_REGION > /dev/null # update service with latest task revision
else
    echo "Creating ECS Service $ECS_SERVICE ..."
    aws ecs create-service --cluster $AWS_ECS_CLUSTER_NAME --service-name $ECS_SERVICE --task-definition "$ECS_TASK:$TASK_REVISION" --desired-count 1 --region $AWS_DEFAULT_REGION > /dev/null # create service
fi
if [ "$(aws ecs list-tasks --service-name $ECS_SERVICE --region $AWS_DEFAULT_REGION | jq '.taskArns' | jq 'length')" -gt "0" ]; then
    TEMP_ARN=$(aws ecs list-tasks --service-name $ECS_SERVICE --region $AWS_DEFAULT_REGION | jq '.taskArns[0]') # Get current running task ARN
    TASK_ARN="${TEMP_ARN%\"}" # strip leading/trailing double quotes
    TASK_ARN="${TASK_ARN#\"}"
    aws ecs stop-task --task $TASK_ARN --region $AWS_DEFAULT_REGION > /dev/null # Stop current task to force start of new task revision with new image
fi

One very important thing about a task definition, other than defining which Docker image to use, is that you can define the environment variables that the Docker image will see when it's running (as a task!). This is very important for me because I define the RDS connection string information in here, which includes passwords etc. Although I define the task definition in the source code, it does not have passwords in it; instead, I update it by bumping its revision and applying the new task definition to the service, and the service then runs its tasks using this new revision.
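
As a rough sketch of that revision rollover (boto3 again rather than the CLI; the environment variable name is just an illustrative ASP.NET Core-style setting, not my actual task-definition.json):

import boto3

ecs = boto3.client("ecs", region_name="eu-west-2")

# Register a new revision carrying the runtime environment variables (illustrative values).
new_rev = ecs.register_task_definition(
    family="coreinvestmenttracker-task",
    containerDefinitions=[{
        "name": "coreinvestmenttracker",
        "image": "123456789012.dkr.ecr.eu-west-2.amazonaws.com/coreinvestmenttracker:latest",
        "memory": 256,
        "environment": [
            {"name": "ConnectionStrings__DefaultConnection", "value": "<substituted at deploy time>"},
        ],
    }],
)["taskDefinition"]

# Point the service at the new revision; its tasks get replaced with the new settings.
ecs.update_service(
    cluster="default",
    service="coreinvestmenttracker-service",
    taskDefinition=f"{new_rev['family']}:{new_rev['revision']}",
)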

Hopefully I'll be able to get the system set up in such a way that I can stand it up and tear it down quickly to avoid long-running costs while developing. I've read a little about AWS data pipelines as a way to achieve this, so I'll look into that later. In the meantime, it's slowly coming together.

Category: Code
By Stuart Mathews
02 March 2018
Last Updated: 04 March 2018

Mocking out a function with an Action function as parameter

I've been working pretty flat out recently, trying to recover from the effects of a large API change on our Excel add-in. We've changed the signatures, return types and behaviour of the API to be more intuitive. This means I've had to refactor all client code that expected the old way of things. This in itself was not the hard part; the hard part was fixing our tests.

One of the ways we ensure that our functionality is doing what we expect is to test it by assessing the before and after states of certain pieces of code, ensuring the functionality hasn't changed those expectations. We use Moq to do this. So basically we've got test coverage for most of our functionality, and these tests have an absolute cadenza when these types of API changes happen.

The most interesting problem I had recently was around the mocking nature of these tests. The basic testing principle described above is further complicated when the functionality you are testing does something behind the scenes that relies on being 'online' or 'in production', or on connecting to a database somewhere. These are all things you don't want when you're testing a piece of functionality that has such a dependency. So we 'mock out' that piece of work and continue to test everything else.

Mocking allows you to have a say in what happens when that dependency (say, fetching data from an API or database) is reached. In most cases you'll tell it not to go and fetch the data; instead you'll give it 'mock' data, and the code under test will use that moving forward, as if it had happily fetched the data live from the database/internet.

The faked data is just there to get past that external dependency (the call out to the internet/database); once past that point, the mocked data is used throughout the functionality under test. We provide this fake data before the test and then check that our expectations about the test, and perhaps about how the functionality has affected that test data, still hold after the functionality has run.

So the particular problem I've recently had was to resolve an issue whereby I needed to mock out an API request, but it's a rather odd API request: it takes a function as a parameter. I needed to mock out the function itself, but still have the function parameter run, and mock that too. Huh? How bizarre. Have a look:

This is the call to mock, or at least its signature; note the Action<IMeta> parameter. I don't want to mock that out completely...

public delegate void GetPortfoliocallSignature(
    Scope scope,
    Code portfolioId,
    DateTimeOffset? effectiveDate,
    AsAtPredicate asAt,
    Action<IMeta> getMeta);

And this is how it gets called internally, within the test:

IMeta portfolioMeta = null;

var results = PortfoliosHttpClient.ListPortfolios(
    scope,
    ParamUtils.GetDate(effectiveDate).ToDateTimeOffSet(),
    ParamUtils.GetDate(asAt).ToAsAtPredicate(),
    getMeta: meta => portfolioMeta = meta);

This is basically a call that goes and fetches some data from the API (ListPortfolios). Now, I don't want it to do that when I'm testing, so I'll mock this call to return some fake data. The problem is that there is an Action<IMeta> parameter which is responsible for setting a variable declared before this call. I still need that variable to be set by this call, but I'm mocking the entire function out, effectively removing the effects of this call during the test (even the Action parameter, which would normally be run) and replacing it with a mock call.

This unfortunately means that the parameter is never evaluated and my variable (portfolioMeta) is never set; it's always null. I'd like the mock to supply fake data to that Action too, so it still sets my variable, while the effects of the entire function (ListPortfolios) are mocked as well. This is how it can be done:

var mockPortfoliosClient = new Mock<IPortfoliosClient>(MockBehavior.Strict);
mockPortfoliosClient.Setup(x => x.GetPortfolio(
        TestScope.Scope,
        portfolio.Id.Code,
        null,
        AsAtPredicate.Latest,
        It.IsAny<Action<IMeta>>()))
    .Callback(new GetPortfoliocallSignature((scope, id, date, at, meta) => meta(TestData.Meta.DummyMeta)))
    .Returns(ResponseHelper.CreatePortfolioResponse(portfolio, details, new PortfolioPropertiesDto { Properties = properties }));

Mock out the main ListPortfolios function, but register a callback when you do; then you can influence the mocked call, including making the Action parameter always provide mocked data via the call to:

=> meta(TestData.Meta.DummyMeta))

This basically says: I'll provide a mocked invocation of the Action parameter, so anyone calling the overall function (our test) will have my mock Action run, and in this case that mock will provide mocked data to the outside world instead of not running at all. It's a fairly complicated scenario to explain... but I'll try.

This means that when the mocked call is invoked, a callback that I've subscribed to is fired, and this lets me intercept the call and, more importantly, influence the invocation, including substituting the Action<IMeta> parameter with my own version. That version always supplies mock data, and so always sets my variable to this mock data, over and above mocking the entire API call.

You can do this with Moq's .Callback() method which, as explained, allows me to influence the call to the Action<IMeta> meta parameter and, at the same time, mock out the call to GetPortfolio entirely and mock its response.
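
For what it's worth, here is the same pattern translated to Python's unittest.mock, purely as an analogue to illustrate the idea (the names and data are made up): a side_effect callable receives the real arguments, so it can invoke the callback argument itself and still return canned data.

from unittest.mock import MagicMock

DUMMY_META = {"links": []}          # stand-in for TestData.Meta.DummyMeta
FAKE_RESPONSE = {"portfolios": []}  # stand-in for the mocked API response

client = MagicMock()

def fake_get_portfolio(scope, portfolio_id, effective_date, as_at, get_meta):
    get_meta(DUMMY_META)   # like Moq's .Callback(): still run the callback argument
    return FAKE_RESPONSE   # like Moq's .Returns(): hand back canned data

client.get_portfolio.side_effect = fake_get_portfolio

# The code under test would call it like this and still get its meta variable set:
captured = {}
result = client.get_portfolio("scope", "ABC123", None, "latest",
                              lambda meta: captured.update(meta=meta))
assert captured["meta"] is DUMMY_META
assert result is FAKE_RESPONSE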

That’s pretty nifty.

Category: Code
By Stuart Mathews
18 February 2018
Last Updated: 16 March 2018

Latest progress with Excel JavaScript plugin

I made some progress on the Excel add-in I'd been tasked with developing. Up until this time, I'd been so absorbed in learning the main code base at my new company that I had no time to spend on this. Actually, it's a totally different beast, being a JavaScript-centric project. I'd been assigned countless bugs and tasks for the main code base, so all my time was spent doing that.

During the last week of my illness I was basically bed-ridden, but I could code, so I did, and I made some really great progress. I took a lot of ideas from my Angular 4 investment project and incorporated them into the plugin. I also started getting my head around the Excel API. This code uses Promises, which I learned about a while back, so I could finally put that to use. Promises lead to code that looks like this (notice the multiple returns, which return new Promises):

static SyncTable<T>(entities: T[], tableName: string, exclComplexTypes: boolean = true): OfficeExtension.IPromise<any> {
    // Get the named table.
    // Get the number of rows; if it's different, find the different rows from the bottom and create those entries
    let changes: TableChange<T>[] = [];
    //
    return Excel.run(context => {
      const currentWorksheet = context.workbook.worksheets.getActiveWorksheet();
      var table = currentWorksheet.tables.getItemOrNullObject(tableName);
      var bodyRange;
      return  context.sync().then(tableResult => {
        // now should or should not have table?
        if (table.isNullObject) {
          // empty table - normal entityToGrid
          console.log('Original row count is ' + entities.length);
          ExcelUtils.EntitiesToGrid(entities, tableName, exclComplexTypes);
        } else {
          // Existing table, get table and check the number of extra rows
          bodyRange = table.getDataBodyRange().load(['values', 'rowCount']); // cant use yet
          return context.sync().then(dataBodyResult => {
            console.log('RowCount is : ' + bodyRange.rowCount);
            if (bodyRange.rowCount > entities.length) {
              // get the added rows
              var diffCount = bodyRange.rowCount - entities.length;
              for (var i = 0; i < diffCount; i++) {
                var itemIndex = entities.length + i;
                var result = bodyRange.values[itemIndex];
                var obj: T = ReflectionUtils.toObject<T>(result, ReflectionUtils.getEntityProperties(entities[0], exclComplexTypes)[0].sort());
                var change: TableChange<T> = { change: 'a', value: <T>obj };
                changes.push(change);
                console.log('new row: ' + JSON.stringify(result));
              }
              //return new Promise<TableChange<T>[]>((resolve, reject) => { resolve(changes); });
              return context.sync();
            } else if (bodyRange.rowCount < entities.length) {
              // note which rows were removed and get them from the original entities
            }
            //return new Promise<TableChange<T>[]>((resolve, reject) => { resolve(changes); });
            return context.sync();
          });
        }
        
      })
    }).then(value => {
         return new OfficeExtension.Promise((resolve, reject) => resolve(changes));
      })
      .catch(error => {
      console.log(`Error: ${error}`);
      if (error instanceof OfficeExtension.Error) {
        console.log(`Debug info: ${JSON.stringify(error.debugInfo)}`);
      }
    });
    
    
  }
}

That code turns a list of JSON objects (entities) into an Excel table, and then, if you edit that table and add extra rows, it detects the new rows and returns those changes as a promise. I then act on those changes by 'subscribing' to that promise and creating them on the server. The returned promise contains the list of changed rows... it's like madness.

 sync() {
    ExcelUtils.SyncTable<Trade>(this.trades, 'Trades', false).then((values) => {
      var syncResults = <TableChange<Trade>[]>values;
      if (syncResults && syncResults.length) {
        syncResults.forEach(tableChange => {
          var thinNewTrade: Trade = <Trade>{};
          var thinTradeRow = tableChange.value;
          for (let property in thinTradeRow) {
            (<any>thinNewTrade)[property] = (<any>thinTradeRow)[property];
          }
          this.zone.run(() => {
            this.readyTradeForUpsert(thinNewTrade);
          });
        });
      } else {
        this.message = 'No changes detected';

      }
    });
  }

It's all very rough, but it's got a lot of return context.sync().then(...)... return context.sync().then(...)... etc. This is the chaining of promises that I'm talking about, which you need to do when working with promises.

Here is the code that will turn any list of JSON objects into a table in Excel:

static EntitiesToGrid<T>(entities: T[], tableName: string, exclComplexTypes: boolean = true): void {
    Excel.run(context => {
        const cols = ReflectionUtils.getEntityProperties<T>(entities[0], exclComplexTypes);
        cols[0].sort(); // sort columns alphabetically
        const rows = ReflectionUtils.getEntitiesPropertyValues<T>(entities, cols);

        // Turn camel case into display case ie. myNameIsEarl becomes My Name Is Earl
        cols[0] = cols[0].map(item => { return StringUtils1.Displayify(item); });
        const currentWorksheet = context.workbook.worksheets.getActiveWorksheet();

        // Work out the range we need based on the number of columns we have
        const range = 'A1:' + StringUtils1.xlsColumnDef[cols[0].length - 1] + '1';
        const expensesTable = currentWorksheet.tables.add(range, true /*hasHeaders*/);
        expensesTable.name = tableName;
        expensesTable.getHeaderRowRange().values = cols;
        expensesTable.rows.add(null /*add at the end*/, rows);
        expensesTable.getRange().format.autofitColumns();
        expensesTable.getRange().format.autofitRows();
        return context.sync();
      })
      .catch(error => {
        console.log(`Error: ${error}`);
        if (error instanceof OfficeExtension.Error) {
          console.log(`Debug info: ${JSON.stringify(error.debugInfo)}`);
        }
      });
  }

Most of this code can be found here. 

  • Excel Api
Category: Blog
By Stuart Mathews
18 February 2018
Last Updated: 16 March 2018

A bad case of Pharyngitis

Wow, I've been unwell these past couple of weeks. It all started with a little tickle in the throat that I noticed while I was in a coffee shop meeting up with an old work colleague. It turned out to be pretty damn nasty: a really bad case of pharyngitis.

I just had to go to the doctor, and that usually means I'm pretty unwell. I got up at around 5am and decided I needed to go because I'd not slept in four days straight and I was so tired and weak. Coughing was incredibly uncomfortable and swallowing was a nightmare, which basically kept me up all night, every night. I didn't want to do anything - at all. I'm really surprised at how weak I was, actually - my muscles were all jelly-like and I had a real bad case of not wanting to swallow. It felt like my throat was super-glued together. So, to cut a long story short, I got some antibiotics and that's when things got real.

I basically got knocked out for six...

I don't know if it was just timing or the stage of the illness, but it may have just coincided with the antibiotics; either way, it all went to hell. I got a lot worse, but there was no question that the antibiotics were necessary, as I certainly had an infection in the throat. It was just terrible. Everything got worse. There were times, honestly, that I questioned being single. Another long-time realisation that often comes up when I have an infection of some sort - the last one was an infected fingernail after clipping it too short with the nail clippers - is how, back in the olden days, any infection would most often mean death; certainly the case in the middle ages, and any time, I guess, when there were no antibiotics. That's a sobering thought. If I rewound time back to an era without antibiotics, I'd probably be dead. Woah.

There were times honestly that I questioned being single

I remember sitting in the doctor's waiting room and thinking how I must be very accurate about my symptoms so that I could get fixed correctly. So right there I wrote down all my symptoms on my phone. Here they are, for fun:

  • Started experiencing a scratchy throat on Saturday (today was Monday).
  • Couldn't sleep Sunday and today.
  • Shivers. Hot and cold at the same time.
  • Not hungry. I've eaten rye bread with Marmite though.
  • Muscles are weak.
  • I feel dizzy.
  • Swallowing really hurts.
  • Incredibly sore - not like any sore throat I've ever felt before.
  • Sometimes when I have a sudden bout of coughing, I feel nauseous.
  • Strepsils don't soothe the throat in any way.
  • I had headaches and took Day and Night Nurse, which helped, but did nothing for the sore throat.
  • There is mucus; it was originally yellow but now appears to be white, and a lot less is coming up, but the throat is still sore, if not more so.

So those were my symptoms for pharyngitis. This was during the first week that I took off work. After a while things started to subside, and once I'd finished my antibiotics and survived the rest of the week, the throat started to ease up. Thank god. The next week, I think, I was recovering from the antibiotics, and by then I had a normal cold/flu to deal with as well. Someone cut me a break!

The second week I actually wanted to use the computer again. I decided to contribute to some work that I'd started before I joined my new company, and made some good progress on an Excel JavaScript plugin I'd been working on. I also spent some quiet time reading the main code base. In hindsight, I think I really needed a break from work; I'm just not sure this was how I wanted it!

I went to work on Friday, a little weak, but compared to the preceding weeks I had Superman's strength.

I've not been and will not be pressuring my body moving forward - I feel my body and I have gotten a lot closer. So no gym or running until I know I'm totally better. A guy at work asked me if I'd be ready for the weekly work run. I said probably not. I told him the story about the girl at my school who was sick (bronchitis, I think), played netball that afternoon, and died. No jokes, deadly serious. That has always struck a chord with me: don't exercise or play sports when you're not well. So I'm not going to tempt fate.

Actually, while I'm here - the current thing annoying me is how long it takes for my code to get merged into main... maybe it's because I'm new and the reviewers keep me and my code at arm's length. Anyway.

Category: Blog
By Stuart Mathews
28 January 2018
Last Updated: 15 March 2018

The Bar

I've completed my first month in my new role in the City, and recently met up with an ex-colleague over coffee; she asked me what it's been like.

All the people are nice, most very welcoming, and most have been with the company from day one, all having worked with each other before. There are about 10 people in the company (it's a year old). So everyone seems to know everything, and while this is great, I immediately realised that I wasn't there for all of it - the design and architecture plans, the whys and why-nots - all that stuff is a void for me. But here I am, I've arrived, albeit somewhat late, and there is lots to figure out now…

I think that’s the best way I can describe it.

The other day, the director asked me in the kitchen whether I'd got a fright, and the answer quite simply is no. I'm just trying to learn a year's worth of work, decision-making and technology in one month. That's all. I didn't say that, but that's really what I think I meant at the time.

I've worked in application virtualization, which is pretty cool stuff, and I've designed algorithms that crack open applications and extract their DNA for analysis. I've written C, C#, Java, Python, JavaScript and TypeScript; I've thrown around pretty much every type of programming ball that I've needed to. I'm not even talking about my youth (assembler, Pascal, C++ etc.). And boy has that been fun. So what's with the new opportunity? Speaking frankly, it's one word: finance.

Finance is something I've never been in. I've been in companies that have many banks and financial institutions as customers, but it's a sector I've never directly worked in - but hey, if it's got to do with throwing coding balls around, I can do that. That's what I love doing.

So from a technical standpoint, I'm right where I'd like to be, in a place I've always aimed at being; I've been involved in software engineering pretty much all my life.

Recently, before starting at this company - totally unrelated, but it seems on par with why I'm here now - I developed my own personal financial services API, complete with a website that talks to it and records the financial investments I input into it. So many aspects of the solution are familiar to me. Previously, I'd started getting pretty interested in investing, shares and stuff like that, having bought into a few collective investments and various company shares - this is why I needed something to keep track of all my financial decisions. And well, this is what this financial company is aiming towards: a unified API that people can use to store and input their financial information and decisions etc… Obviously there is a lot more to their API than mine, and they've thought about this a lot longer than I have. It's quite a mature set of ideas. But it's ideas I don't know completely yet.

But, and there always is a but - as I mentioned, this is nothing compared to knowing absolutely nothing about the story and design decisions that make up their company, technology and design. Sure, it looks similar to what I've done, but I'm a layman in this domain. I'm not a trader or a financial exec, and never have been - I'm a coder, a software craftsman; I day-dream about modelling thoughts with technology, not securities or instruments (which, by the way, I have only very recently come to understand). Having said that, I'm in good company - almost everyone in the room has been coding financial systems for years and years. While I was virtualizing applications across networks, streaming pixels across the internet or analysing binary files for specific patterns, these guys were doing financial stuff.

Joining the party late means you’ve not got used to everyone or everything yet, and everyone is looking at you. The music’s playing but you’re not dancing.

So back to my coffee. I told her something like this: "er, well, it's very interesting and the people are quite smart and, well, it's just a big change for me." What I didn't mention to her was that it is actually quite hard turning up to a party late. And that really has been the hard part of this whole experience.

Sometimes I'm my own worst enemy when it comes to expectations. I think this derives from my childhood, where I often found myself not feeling 'ready' yet. I had a bar: when I reached it, I was satisfied that I was comfortable to go, that I was ready, but often I had to 'go' before I could reach that bar. That's frustrating. So I've become accustomed to being comfortable with being uncomfortable…

So most of the time I ready myself on the fly, while doing whatever I was supposed to be ready for. It's a good skill, now that I come to think of it. I guess you could say I've had practice at hitting the ground running many times. I expect myself to manage and I always figure it out, grit my teeth and get on with it (bloody nose or otherwise). I'm a pretty robust worker; I strive for excellence in everything I do, and this is where my bar comes from, I think, and it has been the biggest hurdle for me since joining: I want to know everything now. It's such a frustrating position to be in, and it's quite difficult to be satisfied. I'm still not satisfied.

It's important for me to write down some achievements because otherwise I think I've not had any, and that gets me down a bit. So let's begin.

  1. Submitted my first code on day 1 using their development setups and processes.
  2. Started reading through the Excel add-in (C#) code and learnt how it worked (from the customer's point of view).
  3. Added code changes to the plugin and HTTP client (implementing some tasks in the backlog).
  4. Followed the server code up and down the stack, trying to self-document it for myself - the path of an incoming API request, from user to database (well, not quite the database yet, but the layer that interfaces with it).
  5. Started learning a bit about functional programming, having seen language-ext Either<> and Option<> code.
  6. Started learning about the basic entities that make up the API and how queries are dealt with on the server.
  7. Implemented a basic entity on the API and started becoming familiar with the programming patterns and design - but from a code 'out' perspective.
  8. Authenticated with Okta and then with the API from a new Excel proof-of-concept add-in written in JavaScript that they had me look at.
  9. Can list all portfolios in a scope in the POC add-in.
  10. Started to implement some new functions in the normal Excel add-in (C#).

Here is the POC excel-plugin I wrote: Reflections on Excel JavaScript Add-in

I’ve been on two runs with some of the guys in the company here and here 

I can speak to most people, but I've yet to become comfortable with any of them. I've watched the engineers get tipsy on a Friday night at after-work drinks and have a merry time - and I laughed, which was very entertaining.

Another notable thing is that I don't think I'm judged on the way I dress or the beer that I drink (or don't drink).

It does annoy me that I'm not yet proficient at many aspects, but I think this isn't all bad. I've been really quiet this month; some might say I've been almost too quiet. Indeed, someone thought that I'd gotten a fright at having just joined the company and seen the landscape. That's not it.

I think it's because I've been trying to reach my bar; I'm always trying to reach my bar…

You've got to Bounce: To travel hopefully is a better thing than to arrive
