I suppose seven days is enough time to tell if one has made a terrible decision in leaving one job and starting another, wouldn't you agree? Time enough to feel the familiar lick of flame as you realise you have inadvertently left the frying pan for the fire, or to realise that the grass did, in fact, only look greener on this side.
There were some familiar 'failings' that could have set alarm bells ringing: The induction schedule being rewritten in favour of actual work, meetings being cancelled, people being unavailable. The odd thing is that, given the general atmosphere of the place and the level of activity of the people involved, at no time did it occur to me that any of the changes were actually harmful to me. I was always able to catch up with the people in question later, and they were always happy to discuss issues and answer questions. I was happy, therefore, to forgo the induction in its full sense and get stuck straight into 'being useful'. I got the distinct impression that this came as an enormous relief to the rest of the team. I know this because they actually told me.
I am a systemiser, and it was clear almost immediately that the systems and procedures here are inefficient at best, and might be entirely unworkable were it not for the tremendous effort being put in by the staff to keep everything running. I would like to be a part of seeing this fixed, and think they might let me. I know this because they actually agreed - the systems don't work well. Their honesty was refreshing, to say the least.
This was never the case in my previous job: Systems and procedures were everywhere, constantly updated without consultation and force-fed with no concern for their wider effect, beyond their ability to inflate the progress statistics that drive results and, ultimately, funding. In order to be both efficient and successful in that environment, it pays to be cognitively biased: Try to consider too many things, or too many people, and things slow down; people at the top start to sweat. People at the bottom start looking for another job.
I watched a re-run of a Horizon programme recently about the work of Daniel Kahneman, psychologist and Nobel Prize-winning economist. His work on cognitive bias forms an important part of the model of NT and Aspie cognitive processing in Peter Flowerdew's work. It was interesting to see people falling into the trap of making biased decisions by 'cutting out' data in everyday situations in favour of faster processing, even to the point of 'ignoring' a plainly visible assault in one experiment. This is Kahneman's 'System 1' (fast thinking): That wonderful, flexible neural network that filters out all the 'unimportant' stuff and allows instant, intuitive connections to be made in their place, all at lightning speed and mostly without your knowledge: The network that is so underdeveloped and underused in Aspies... Kahneman's genius was to highlight the triggers, influences and repercussions of this type of decision making.
Sadly, the programme concentrated on System 1 and never really went into any detail about 'System 2' (the system I find myself limited to for much of the time). In neurotypical people, this slow, logical, analytical, and much more reliable way of thinking is usually reserved for difficult and involved maths problems and the like. Aspies tend to rely much more on this system, and it is often developed to a remarkable degree. If everyone had to use this system for all their thinking, they would certainly need things to slow down, as Aspies do, but they would also make fewer mistakes.
As I watched the programme, my voice was raised at the end of every experiment: "An Aspie would have made an objective decision!" "An Aspie wouldn't have made that judgement!" "An Aspie wouldn't have been swayed by popular opinion or simply by 'apparent fit'!" "An Aspie would have looked at all the facts!"
So, the question is: What is better? To rush decisions and finish on time, but with a good chance of having got something important wrong, or to take things more slowly, consider carefully, and finish later having got everything right?
I had hoped that economists had learned something from the financial woes of the last 15 years. Kahneman told them that people take risks - and the greater the potential loss, the greater the risk they are prepared to take. Perhaps it's time to consider putting System 2 to use a little more. (I know some Aspies who could help with that.)
A beautiful map of necessary ignorance? What would the Aspie version look like?