- Details
- Category: Blog
- By Stuart Mathews
- Hits: 2688
Since The Law, Being mental and Data Science, the last two weeks have been fairly muted in some respects and quite substantial in others. I've been wanting to complete an assignment for my Continuous Professional Development (CPD) module and it's been weighing heavily on me, though I've been avoiding it. The subject matter concerns reflective thinking, the nature of understanding (knowing) and concept maps, among other things. I think I was a bit ambitious with this one.
Last weekend I didn't do anything related to it whatsoever; I hardly even made eye contact with my computer. Instead, I magically found inspiration to go to the gym. So much so that I went on Saturday morning, Saturday evening and Sunday morning. Funny that. The best way to procrastinate, in my opinion, is to go for a run and do some exercise. At least it's healthy.
In prior weeks, I'd been heavily focused on another assignment for my Digital Forensics module, which is currently covering English law. It might appear boring, but it's quite interesting and I'm enjoying it.
In Tuesday's C programming class, we covered pointers and while I knew what was going on, other people's heads blew up (except for this one guy who seemed to get it).
I also submitted my end-of-course project, which I cobbled together to create a very simple noughts and crosses application that two people can play. It uses multidimensional arrays to represent the board and I've tested the puzzle-solving algorithm on a 3x3 board and a 4x4 board, although we only needed to code up a 3x3 board. It's not pretty but it works (a bit like me really...). Also, I didn't need to use multidimensional arrays, but this has always been one of my weaker subjects so I wanted some practice, and it's related to pointers so that's OK. The brief for the project is here
The class is drawing to a close now and I've only got one, perhaps two, classes left and I'm going to miss it. One of the students that I'm in contact with sent me an email saying that she is eyeing out the Python course, but I'm not really interested in re-learning Python, or maybe it's that it would cost me another £500.
An interesting tidbit that I came across in my digital forensics material was a mention of how the C programming language was not designed with the current mindset in security engineering, which is one of usability, i.e. making it easier to be safe. As master Yoda would say, "that it does not".
But then again, if you're programming in C, it's likely that you're alone in a dark place and the effects of your C program will only kill you and your immediate fellow underlings down there. I miss the dark days...
Anyway, during the week, on the train every other day (basically my non-run days), I've been slowly piecing together my CPD assignment and today I completed the majority of it. That said, it's not finished; it still needs proofreading, referencing and condensing - word limits are really restricting these days. I'm trying to manage my commitment and sanity.
Today, I switched back to the next sections in Law, discussing the Computer Misuse Act of 1990 (CMA), and that was how the day transpired really. Although I didn't go out except to buy some snacks or go to the gym, as I mentioned, I'm rather enjoying the law subject we're on. We've got another few weeks left so perhaps by the end of that, I'll be less enthusiastic?
One of the most revealing aspects of digital forensics has been focusing on the law, and while doing my Software Engineering module, which I finished a few months back, I also found the sections around intellectual property, copyright and reproduction law quite fascinating. One of the great things about understanding more about the law is understanding how not to break it, and how it generally works. Seemingly small things like modifying an application's binary contents, circumventing a security mechanism or obtaining access to machines just because you can now carry some really serious penalties, and in some cases, like breaching section 3, you can be imprisoned for up to 10 years. Btw, section 3 under the CMA is about manufacturing viruses and impairing computer operation in general. An interesting read is the set of cases where people have been convicted of related computer crimes, collected on this interesting website: http://www.computerevidence.co.uk/Cases/CMA.htm.
All this has rather muted my recent efforts to get an animated sprite up and running in my game development project, and even my reading on the matter has somewhat taken a backseat. But that's OK, everything has its priority and place and I'll get straight back into it when the other priorities start dying down somewhat (the CPD and DF assignments for one).
I've managed to stay disease-free, meaning I've not caught a cold whilst sticking to my running regime of about 25km a week. This is good.
And if you're wondering what Bran Flakes has to do with the title of this post - well, let it be said that one should never go for a run in the morning fuelled up with All Bran Flakes. I did, and I nearly did a Mikael Ekvall en route - I didn't, but never, never tempt fate. That being said, I had a good run, setting a new threshold heart rate of 154, which I gather is the highest average heart rate I can sustain over the route/distance I run. So I seem to be able to average out at 154bpm over 8.5km.
- Details
- Category: Blog
- By Stuart Mathews
- Hits: 2494
Since Games, STL, Digital Forensics and a Fedora, I've been pushing forward; however, I'm pretty tired this weekend. I skipped my run into the office on Friday because I suspected that I was coming down with something and thought better of it.
I went to the gym yesterday and I felt a bit weak. The run in was a bit tough because the sun was blazing down. I think that, unlike most, I'm OK with running in the heat; still, it was hot under the sun.
I worked on my back and I had to pace myself because there were times that it was pretty tough going. I think because I'm not as regular as I used to be, I've not developed the resistance to hard training, so when I do it, it's tough going. But I just need to adapt to my new routine.
I had to slow down during my bench rows, as I strained a little near the end of my sets, and my lat pulldown wasn't as heavy as it normally is, though I kept my form, which is important, and this morning I can feel the areas of my back I worked. The run back that day was also a little tough and I suffered somewhat, but I'm used to that - so I continued through it. You've got to be a little bit mental. Maybe a lot.
Today I woke up and decided to do it all again. This time the run in wasn't that hot or tough, but I can certainly feel my weight inhibiting me. I'm trying to get in shape so that my running isn't so tough and I think it's working, though perhaps just being tired made me feel sluggish.
I decided to do shoulders today, which is notable because I've strayed far from shoulders recently, and I'm very happy with the results. I was able to articulate my shoulder through a full range of motion. It wasn't heavy but that's OK; I need to ensure my shoulder is strong and fully recovered. I guess I'll really determine the outcome of today from tomorrow. Again, much like yesterday, I focused on full-range motion and left my ego at the door. I ran back and felt pretty OK - a lot easier than yesterday.
After yesterday's session, I was a bit pooped and fell asleep after my shower - propped up on my bed headboard. That was a good little break because it allowed me to regain some energy and motivation.
I spent the rest of the day studying Law - common and civil law and how the courts and the law are structured. It was pretty interesting. I also did some planning for my next assignment, which is about Continuous Professional Development. I deferred it until this semester because it gave me real problems last time, so I'm coming into it with fresh eyes.
Had my weekly lecture on C programming on Tuesday, which gets me home at around 22:00 and in bed around 23:00 - a bit painful but necessary. I helped a colleague to understand the concepts inherent to C, which are quite challenging at times. I didn't realise how loudly we were talking and was asked by this one guy in the room, very loudly, to "Stop that, it's really annoying!". I was actually quite surprised, shocked really. The dude is also finding the work tough, so I guess he was just really annoyed at not being able to get 'it'. We were going over functions and we've yet to reach arrays and pointers, which should blow his mind.
Anyway, it made me realise how unused I am to being in a classroom environment.
Though I think I really helped my colleague. Boo-hoo to him. I said sorry and left it at that. Moving on.
Apart from this, I've recently been working with a colleague on a data science platform - well, just the beginnings of it. It is pretty cool because it's an application of my dataflowpipline project I recently started. The key requirements of the pipeline for a data transformation exercise are to put some data through the pipeline, run some anonymizers on it, and then run some feature extraction functions on it.
I sat with him and we coded it up together based on his experience at CERN. I had to go away and re-code it because one last aspect of the discussion (around building a delegate that constructs a pipeline for just one item) only became clear to me later. Sometimes my brain needs to sleep on it to work it out. Anyway, I worked it out (see below) and it was quite interesting and fun doing it (maybe I should get out more).
The key was to create a pipeline builder that would build a pipeline that could take in any single item of data, anonymize it and then extract features from it, with both these steps represented by custom Processes() that represent those stages of the pipeline. Each process is represented as a stage and each stage is represented as a function, i.e. Func<T,T>. The goal, as mentioned previously, was ultimately to represent a delegate that would take a single item of data, construct a dedicated pipeline with the anonymizer stages and the feature extraction stages built in, and then put the item through the pipeline. The full transformation would then use this generically typed delegate on all the items.
I came up with this prototype, which transforms a list of People objects in this case; in theory, though, any objects can be used - the appropriate anonymizers and features can just be defined and passed in when creating the pipeline, and it will bake those into the pipeline.
This is the way you'd create a pipeline for a list of people. Just add anonymizers and feature extractors:
// This is a generic builder where we can use any type of data
// and transform it into an array of, in this case, integers
// The features take a person and turn it into an integer
// if there are 3 features, then 3 integers will result: int[3]
// for one person passed into the pipeline
var builder = new Builder<Person, int>();
// This will take all the anonymizers, features and the data
// and transform all the items into items of feature results
var result2 = builder.Transform(items: listOfPersons2, anons: anons, features: features);
The anonymizers are just a list of functions that transform the data somehow, and the features extract features somehow. The result is a matrix of this transformation of all the items into arrays of extracted features, which can then be mined or run through an algorithm for analysis, a-la data science.
The anonymizers and features are defined here:
// Here are two anonymizers
// We'll represent them as stages in the pipeline
Func<Person, Person> Anon1 = (person) =>
{
    person.Name = "Boop";
    return person;
};
Func<Person, Person> Anon2 = (person) =>
{
    person.Age = person.Age + 5;
    return person;
};
// These are the feature extractors
Func<Person, int> featureEx1 = (person) => person.Age;
Func<Person, int> featureEx2 = (person) => person.Name.Length;
// Turn the anonymizers and features into arrays of funcs
var features = new[] { featureEx1, featureEx2 };
var anons = new[] { Anon1, Anon2 };
Actually, providing multiple stages (anonymizers) in the dataflow pipeline library wasn't possible in v0.02, so I had to code a loop manually, and this prompted me to release a new version of the library to support it.
It's 4 days old and has 22 unlucky souls using it.
Anyway, I'll put the full source code at the end of the post.
I popped into my little Italian diner, located on the ground floor of the local shopping centre, after the gym and had an Italian bean salad with Mediterranean veg, chicken, mushrooms, sun-dried tomatoes and avocado. Then I spent most of the rest of the day reading the Association of Chief Police Officers' Good Practice Guide for Digital Evidence, which can be found here: https://www.digital-detective.net/digital-forensics-documents/ACPO_Good_Practice_Guide_for_Digital_Evidence_v5.pdf and which has been on my todo list since last weekend. Quite a long read but I'm better informed now because of it.
So on the whole, another productive two days.
As is life.
My data acquisition and trigger POC source code, as promised:
using System;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Pipeline;
using static Pipeline.Pipeline;

namespace daq
{
    // This is an example of the data we'll pass through the pipeline
    // in order to transform it
    class Person
    {
        public Person(string name, int age)
        {
            Name = name;
            Age = age;
        }
        public string Name { get; set; }
        public int Age { get; set; }
    }

    class Program
    {
        static void Main(string[] args)
        {
            // Some copies of the input data
            Person input1 = new Person(name: "Stuart", age: 31);
            Person input2 = new Person(name: "Stuart", age: 31);

            // Here are two anonymizers
            // We'll represent them as stages in the pipeline
            Func<Person, Person> Anon1 = (person) =>
            {
                person.Name = "Boop";
                return person;
            };
            Func<Person, Person> Anon2 = (person) =>
            {
                person.Age = person.Age + 5;
                return person;
            };

            // These are the feature extractors
            Func<Person, int> featureEx1 = (person) => person.Age;
            Func<Person, int> featureEx2 = (person) => person.Name.Length;

            // Turn the anonymizers and features into arrays of funcs
            var features = new[] { featureEx1, featureEx2 };
            var anons = new[] { Anon1, Anon2 };

            // This will run all the features on a single Person
            // This is specifically designed to deal with Person objects
            // So it's not generic, but we'll fix that later with a builder
            Func<Person, Func<Person, int>[], int[]> featureVectorizationFunc = (person, featuresLst)
                => featuresLst.Select(fn => fn(person)).ToArray();

            // Prepare copies of the input as a series of Persons: instead
            // of just processing one person we want to process many
            var listOfPersons = new Person[] { new Person("Jack", 29), new Person("stuart", 31), new Person("John", 45) };
            var listOfPersons2 = new Person[] { new Person("Jack", 29), new Person("stuart", 31), new Person("John", 45) };
            var listOfPersons3 = new Person[] { new Person("Jack", 29), new Person("stuart", 31), new Person("John", 45) };

            // Examples of the data flow pipeline transformation process

            // Pipeline applying the anonymizers to a multi-element input before transforming
            // the result with the custom function featureVectorizationFunc
            // While this works, you need to provide each anonymizer explicitly
            var result1 = listOfPersons.Select(p => StartPipeline(() => p)
                .Process(Anon1)
                .Process(Anon2)
                .ProcessAndTransform((person) => featureVectorizationFunc(person, features)).Result);

            // This is a generic builder where we can use any type of data
            // and transform it into an array of, in this case, integers
            // The features take a person and turn it into an integer
            // if there are 3 features, then 3 integers will result: int[3]
            // for one person passed into the pipeline
            var builder = new Builder<Person, int>();

            // This will take all the anonymizers, features and the data
            // and transform all the items into items of feature results
            var result2 = builder.Transform(items: listOfPersons2, anons: anons, features: features);

            // Example using the delegate, which can process just one item
            var result21 = builder.SendThroughPipeline(input1, anons, features);

            // Pipeline applying the anonymizers to one element of input before
            // transforming the result with the custom function featureVectorizationFunc
            int[] result3 = StartPipeline(() => input2)
                .Process(Anon1)
                .Process(Anon2)
                .ProcessAndTransform((person) => featureVectorizationFunc(person, features)).Result;
        }
    }

    /// <summary>
    /// Turn input into a feature set, applying anonymizers on entry.
    /// Fully generic
    /// </summary>
    /// <typeparam name="T">Type of the items</typeparam>
    /// <typeparam name="R">Type of the result matrix</typeparam>
    public class Builder<T, R> where T : class
    {
        internal static Func<T, Func<T, R>[], R[]> featureVectorizationFunc = (item, featuresLst)
            => featuresLst.Select(fn => fn(item)).ToArray();

        /// <summary>
        /// Function delegate that'd then be called with each object
        /// </summary>
        public Func<T, Func<T, T>[], Func<T, R>[], R[]> SendThroughPipeline = (T eachItem, Func<T, T>[] anons, Func<T, R>[] features)
            => StartPipeline(() => eachItem)
                // Run the anonymizers
                .Process(i =>
                {
                    // Really, if Process could do this itself given a Func<IEnumerable<T>> of
                    // anonymizers, that'd be much better than this
                    T prevResult = null;
                    for (int j = 0; j < anons.Length; j++)
                    {
                        prevResult = prevResult == null ? anons[j](i) : anons[j](prevResult);
                    }
                    return prevResult;
                })
                // Split item into feature results and put into matrix
                .ProcessAndTransform((item) => featureVectorizationFunc(item, features)).Result;

        public R[][] Transform(T[] items, Func<T, T>[] anons, Func<T, R>[] features)
            => items.Select(eachItem => SendThroughPipeline(eachItem, anons, features)).ToArray();
    }
}
- Details
- Category: Blog
- By Stuart Mathews
- Hits: 3310
Since C, 64, Triple Black, Resource Management and a Game Prototype, I’ve been working on my game prototype these past couple of days, and it has now morphed in its architecture as I’ve been learning about new concepts.
This weekend I’ve learnt about Scene Management. Scene management is all about targeting the subset of resources in your game that belong to the current scene. So, this means when a new scene is started, the resources associated with it will need to be loaded into memory, and those not part of the scene need to be unloaded, etc. So, there is a bit of interplay between the scene manager and the resource manager.
I didn’t have a scene manager in my prototype and its introduction into my code base meant that I needed to change how my rendering processes work. My game prototype is a bit of a hodgepodge; it’s a scratchpad of different sources of information. This is great because I’m picking and choosing the things that make sense to me and using those ideas. The downside is that not everything is the same as in any one particular book I read. For example, I’m using the game loop from one book (Game Algorithms) and the scene management from another (Thorn), and the two sources differ in how they deal with audio, for example. I’m now considering implementing the actor or game object infrastructure using a third book (from which I’ve already used the event management ideas). It’s fun.
Speaking of fun, I had a good time setting up my Event Manager (which I introduced in Retro sounds, Event Manager and some running) to link up the main game to the Scene Manager to the Graphics Manager. I'm finding the event management the most rewarding part of coding up the prototype. One of the great things about it is that it decouples many subsystems from each other. For example, I have components like the actors subscribe to DoLogicEvent and PositionChangedEvent, and the Resource Manager subscribe to SceneChangedEvent. This is pretty cool. You just subscribe to your event of choice and react when it happens. Being able to use this more frequently is proving how useful the design pattern is.
I’m also glad that I’m becoming a lot more conversational in my C++, and I’m finding the practice of using the STL quite rewarding. The new features I’ve mentioned previously, like lambdas and implicit variable types, are nice.
I also spent most of Sunday writing up an assignment for my Digital Forensics course. This so far has entailed understanding the incident life-cycle, insights into contemporaneous notes and basic introductions into Computer, Mobile, Image and Network forensics, among others, and what they can tell us. Quite interesting.
I must say that while I enjoyed it, the word limits are quite restrictive. I am glad that I didn’t kill myself trying to get it done though - I had started it some weeks before. Recently I’m finding that with my more exotic choices, i.e. modules that I’m not already familiar with (non-Computer-Science-related ones like Forensics or Psychology), it isn’t that easy to predict the effort required. I remember this one time working two weekends for a meagre result, which was both time-consuming and disappointing. That specific module was Psychology, so I’m reserving judgement.
I’ve been going to my weekly lectures (that’s really what they are – he just talks) on the C programming language and I spent the entire session coding up the course final project. It’s a noughts and crosses game. I’m not finished with it and I’m surprised that I didn’t finish it in the class. I’m obviously not as good as I thought I was! That said, I’ve solved it now, but it’s not perfect, which is irritating. So, I’m gaining more from this course than perhaps I thought I would.
My issues first arose from the fact that I decided not to use concepts that we’ve yet to be introduced to in the course, but then I went and based the solution on a multi-dimensional-array-based board – something we’ve not been taught yet. So, I was doomed from the outset – half of the solution is restricted by the above and the other half uses unrestricted ideas (what a mess!) – and I’m hell-bent on using multi-dimensional arrays. I don’t mind it being tough.
I’m stuck at the moment, unable to figure out a generic algorithm to solve the diagonals irrespective of the dimensions of the multi-dimensional array – something that should be easy to solve, but my brain hasn’t been paying much attention to sanity lately. The code is here.
I also found myself in the gym yesterday (Bank Holiday) as I couldn’t go on Sunday (assignment) or Saturday (lazy, and meeting a friend in London). So, I went out on the trot and kicked out a good session. I’m not as strong as I was but I’m OK with that. My shoulder is really improving and it feels like I could work it, but I’m not quite ready yet. I took care to eat correctly because I’ve not been sleeping well recently and that’s when my body starts to fall ill. So far I’ve kept it in check, eaten well and rested well, and that’s a good thing.
During the week, I ordered 3 new books which look fun:
- Game Architecture and Design
  - This one is actually more about the game design process than architecture. That said, there is a single chapter on architecture. I started reading the theory about game design on the train back on the day I got it. It was actually really fascinating. So I’ll get back to it soon.
- An Introduction to 3D Game Programming
  - I’ve not yet started reading this; however, I did thumb through it while eating a meal. It’s basically DirectX 8 using C#, so it looks pretty much like a precursor to the more prevalent and more powerful Unity framework. I’m interested in seeing how the basic 3D graphics principles are put into practical application – irrespective of the language/age of the technology.
- Game Programming All in One
  - This is a wildcard. It’s huge, and it’s from 2008, but I’m sure there is some useful stuff in it, notwithstanding the age. That said, I’m told that in gaming things don’t move that quickly, but I’ve heard other sources say it moves faster! I’m after the concepts, however – the ones that survive the test of time.
On my investment tracker project, I’m working to improve my recently introduced feature – transactions – because there are bugs that prevent it from being usable. I’ve started recording them on my GitHub page. This is a development work in progress, so there are still some rough edges; in particular, error handling and site access (login and registration) are not finished. That said, I’m quite happy with the overall progress of the project.
I’ve been looking into Caliburn.Micro recently to help me quickly build a WPF UI for it but this is still in the starting blocks – early phase.
I also upgraded my Fedora box from 29 to 30, which broke my tablet. For some unknown reason (something that seems to pervade this bloody tablet) it didn’t want to log into Gnome. The new version of Gnome is special and one of the reasons I wanted to upgrade, so that sucked. Up until now I’d just left it – I’ve got bigger fish to fry. I don’t have a (USB) keyboard, so interacting with it in emergency rescue mode was a non-starter. So I decided to go out and buy a brand new keyboard for the tablet (sounds silly), but it enabled me to finally fix the issue.
Through some trial and error, I found that Xorg could not find libgtk-3.0-dev, which sounds to me like the upgrade process made a boo-boo, so I uninstalled gdm, since that is probably what needed it, and reinstalled it. That worked. So, I’ve still got all my content and don’t have to start again. Why go through all this pain and torment? Good question; I guess I’m a masochist. Phew!
So, with the extra keyboard around, I decided to use it more routinely: I plugged it into my laptop and used it to code, and it was so much nicer – I loved it! So maybe I’ll continue having a slightly odd-looking setup: a laptop and a tablet with a full-sized keyboard. Whatever works, I say.
Still running – always running.