Over the last few years we noticed how our telephone costs had gone up, line rental in particular; this is true whether it is BT or Virgin. So we looked for alternatives and found Vonage. For a one-off fee for a box that plugs into our Internet router, and a request to transfer the BT line to Vonage, we removed our whole reliance on the BT line. The telephone plugs into the box and, hey presto, our calls are now routed to us via the Internet.
The cost is only £6 a month, which includes about an hour’s outgoing calls. As most of our calls are incoming, it is quite a reduction in costs.
-
IP Telephone Systems
-
Welcome
Welcome to Wellis Technology’s new web site.
In line with our ethos on web site design, we are now using the WordPress system to create our web sites. In the past we wrote our own editors to allow clients to manage the content of their sites. We still support our software, but now we are looking to use WordPress to do this.
We can then either use an off-the-shelf WordPress theme to show your content, or we can write custom pages to display it.
-
The next generation of computing experts
Here is a link to the next big thing in computing: http://www.raspberrypi.org/. This little kit-type computer is reminiscent of what we used to play with 30+ years ago, and it is aimed at getting people to play again and learn what a computer really is.
-
VSJ – December 2008 – Sounding Board
Council member John Ellis, FIAP has some thoughts on Code Optimisation and Re-Factoring
I’ve recently been working on adding new functionality and rules to a large ISA system. As part of these changes I was amending some Visual Basic code when I came across something like this:
IF p = q THEN
    ' Do X
    ' Do Y
    ' Do Z
END IF
Obviously the code does nothing: every statement inside the IF block is commented out, so the block is just wasted code and, more importantly, wasted execution time.
Quite often (and I’m guilty of this, too) we see a block of code that is to be removed and comment it out. There’s nothing really wrong with that, but over the lifetime of a system it can lead to 50-60% of the code being removed this way. Quite apart from the potential for inefficient execution, it can leave the wood of real comments lost in the trees of ‘temporary’ edits.
I recommend either that code is simply removed (the source code control tools will keep an old copy anyway) or that comment-edits include a date. That way it’s easier for subsequent editors to notice (and remove) commented code if appropriate.
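To make the dated comment-edit convention concrete, here is a small Python sketch (not from the original article; the ' REMOVED yyyy-mm-dd tag is an assumed convention) that flags commented-out code old enough to be purged:

```python
import re
from datetime import date

# Assumed convention (hypothetical, not from the article): commented-out
# VB code is tagged like  ' REMOVED 2008-11-30  so later editors can
# find it and delete anything past an agreed grace period.
TAG = re.compile(r"'\s*REMOVED\s+(\d{4})-(\d{2})-(\d{2})")

def stale_edits(source_lines, today, grace_days=90):
    """Return (line_number, tag_date) pairs for comment-edits past the grace period."""
    stale = []
    for n, line in enumerate(source_lines, start=1):
        m = TAG.search(line)
        if m:
            tagged = date(*map(int, m.groups()))
            if (today - tagged).days > grace_days:
                stale.append((n, tagged))
    return stale
```

A periodic sweep with something like this keeps the ‘temporary’ edits from accumulating into pages of dead code.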
It does remind me, though, that developers sometimes need to step back and spend time really thinking about the code they change. It’s not just about removing extraneous code; sometimes it needs re-factoring.
For instance, a few years ago I was product-managing some code for an insurance quotation system. Rather than just looking at the code on screen, I printed it out so I could work on it while out of the office. Looking at the listing, I realised that one particular section of code seemed to be repeating itself, thus:
IF Insurer = 1 THEN
    IF Thatched THEN
        Function1
        Function2
        Function3
    END IF
END IF
IF Insurer = 2 THEN
    IF Thatched THEN
        Function1
        Function2
        Function3
    END IF
END IF
IF Insurer = 3 THEN
    IF Thatched THEN
        Function1
        Function2
        Function3
        IF LOCATION = "NORTH" THEN
            Function4
        END IF
    END IF
END IF
IF Insurer = 4 THEN
    IF Thatched THEN
        Function1
        Function2
        Function3
    END IF
END IF
…
…
There were about seventy insurers in total, some of them having five or six quotes each! Unsurprisingly, the system was taking around two minutes to execute them all. Having reviewed the code (twenty-plus pages of printout) I was able to re-factor it to:
IF Thatched THEN
    Function1
    Function2
    Function3
    IF Insurer = 3 THEN
        IF LOCATION = "NORTH" THEN
            Function4
        END IF
    END IF
END IF
Simple enough; but the code had been in place for around eight years, and every new insurer or product just got a new block of code because that was the way it had always been done. The alteration removed twenty-odd pages of code and the execution time was reduced to 32 seconds. This obviously had a major impact on the system. Sometimes the re-factoring process can be as dramatic as this, but the most noticeable thing from a developer’s point of view is that the code is easier to read, maintain and debug.
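The shape of that refactoring translates readily into other languages. Here is a rough Python sketch (illustrative only: the Function1..Function4 names mirror the article’s placeholders, and the table-driven layout is my own framing, not the original system’s):

```python
# Shared steps run once for every insurer; only the genuinely
# insurer-specific rule stays behind a condition, mirroring the
# refactored VB listing above. The bodies are stubs for illustration.
def function1(): return "Function1"
def function2(): return "Function2"
def function3(): return "Function3"
def function4(): return "Function4"

SHARED_STEPS = (function1, function2, function3)

# One-off rules: insurer number -> (predicate on location, extra step).
SPECIAL_RULES = {3: (lambda location: location == "NORTH", function4)}

def rate_thatched(insurer, location):
    """Apply the shared steps, then any special rule that fires for this insurer."""
    applied = [step() for step in SHARED_STEPS]
    rule = SPECIAL_RULES.get(insurer)
    if rule:
        fires, extra = rule
        if fires(location):
            applied.append(extra())
    return applied
```

Adding a seventy-first insurer then means adding a table entry, not another page of duplicated branches.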
System managers and users are always pressing to have system changes implemented yesterday, but if we rush them in, we can end up with slow, difficult to maintain code that becomes more unwieldy over time. Gradually, the time spent making changes and debugging the code becomes uneconomic. The systems support developer ceases to be effective and a complete system or sub-system rewrite is necessary, taking months of analysis and development.
There are systems and methodologies to do this re-factoring, but sometimes the simplest and cheapest way is for developers just to stop and think about it occasionally, on the fly, so to speak.
You can contact John at john@74stonelane.co.uk
[Something you’d like to get off your chest? Email me (Robin Jones) at eo@iap.org.uk.] -
VSJ – February 2008 – Work in Progress
Last November, Council member John Ellis, FIAP introduced us to his latest futurological thoughts and posed a few pertinent questions. Here he has a go at answering them.
Using The Technology – First Steps: Let’s imagine that we are at the point where we have the ability for a human being to interface with a room.
An immediate use of this technology is that the room might transmit Alpha wave patterns that could induce a state of calmness in its occupants, relaxing them and helping them sleep – very useful for insomniacs.
Any interface would allow more than one person to be connected to the room (or house, or car), and potentially we now have the ability to communicate with each other via a local network. This may allow only for straight communication, but it could permit direct thought transference, creating a stronger understanding between users.
Using this technology in our cars would do away with the need for car keys – our own thoughts could act as the key. Perhaps cash points could do the same, providing an unrivalled level of security. And imagine houses that recognise their owners! Should someone unauthorised enter the house, the computer could contact the authorities.
A Small Leap: In my previous articles I linked the person to the Internet. Now we have the ability to take our local network via an access port and connect to the wider WWW network and all the devices or other access ports connected to it.
So, for instance, I’m on my way home and realise that I’ve forgotten that it’s my wife’s birthday (I wouldn’t really, of course). I do a direct search to find the nearest florist that is open and ask them to prepare the bunch of flowers before I arrive, all done while I’m driving.
We would need to take precautions adding local security software to keep out the unwanted attentions of viruses and those trying to steal our bank details and so on. It may be that the next generation of firewalls would be able to filter on certain types/locations of queries and allow the harmless queries to pass.
This technology would now allow direct bidirectional communication at a thought level between two people over any distance. We might call this pseudo-telepathy.
A Slightly Bigger Leap: OK, so we now have the ability to communicate with anyone in the world. What if the technology used at the access points could interface subconsciously with the person or people in a particular place, or even several places? This could allow us to ask a question and have the subconscious minds reply. That could be quite spectacular. It brings to mind the novel ‘Destination Brain’ by Isaac Asimov, one of my favourite authors and a man ahead of his time.
There are security issues here too but I would expect the technology to limit access. After all, while it may be OK for me to think, ‘I want to make a Christmas cake’ and instantly Delia’s recipe is popped into my head, it is another thing altogether if I want to make nitroglycerine.
The local devices could now be used to rouse you by sending pleasant messages to wake up, instead of that annoying Beep, Beep, Beep of the alarm clock. They could even send you important information, like appointments and birthdays, while you are asleep, ready for you to act on in the morning. It could of course be used as a teaching tool, implanting the seeds of mathematics, science or even this new technology.
The Big Leap: Having gone this far, what if people could become ‘farms’? Groups of people, conscious or otherwise, might become part of a collective brain, perhaps assigned a particular problem. The idea of a group consciousness might be used to allow doctors to investigate people’s health, both physical and mental, and form a diagnosis; possibly even thought-directed treatments, where someone in a coma could be communicated with at some basic level.
What if we could induce the same results with our animal friends? Perhaps the Chimpanzee, Dolphin or Mouse? Could we find a way to communicate with the animals even at a basic level of awareness? Would our views on the natural world change? What if talking to the flowers became thinking with them?
Misuse: A concern, of course, is that someone could be dragged off the street and their thoughts, bank account details and passwords extracted forcibly. Hang on though! We don’t have any account numbers or passwords – the bank knows who we are because of how we think.
There are obviously police and military intelligence uses here and I worry that they would be prime abusers of the technology. Would we be willing to allow them to access a person’s thoughts? This is just a new instance of the conventional tension between personal freedom and public safety. We want the first until it allows a terrorist atrocity. Then we’re more interested in the second.
More Technology: With all this additional usage, the speed and bandwidth of the Internet would need to be enhanced to cope. Personal IP addresses for everyone would be required, and the underlying protocols of the Internet would need to be redefined.
Ethical Issues: Personally, I do not think that the idea of this technology is actually repulsive, and I think I would take to it. I might not like the idea of surgery, but if a non-invasive interfacing method were available then I would welcome this level of access.
We could, of course, end up in a ‘have and have not’ class system, with poorer people missing out.
Who would control the access points and virtual meeting places? Such questions need to be considered. Cyber-terrorists could in theory hijack the system and do unspeakable things, so we need to look at the safety protocols and how we act on these threats.
Conclusion: The idea of us communicating electronically directly from the brain is not a fantasy, but it is not quite yet a reality. Its possibilities could lead to a vast culture change within the human race; a collective consciousness may bring a level of mutual understanding that is unimaginable at the moment. What do you think?
You can contact John at john@74stonelane.co.uk.
[Interesting project or development? Let us know at eo@iap.org.uk!] -
VSJ – December 2007 – Work in Progress
Robin Jones describes the philosophy behind the new IAP Student Software Development Prize.
Over the last three decades, the Institution has been involved with a number of higher education bodies, both in the public and private sectors. It currently has formal partnership agreements with several UK universities.
We have been consistently impressed by the quality of the project work carried out by students at these universities. However, that’s not the impression a review of the comment pages of the technical press gives you. As a professional body, we are naturally concerned to develop and sustain understanding between education and industry. So the IAP Council has decided to establish an IAP Software Development Project Prize to be offered annually in participating departments in UK universities.
The prize is open to students (who must be eligible for IAP membership) on Honours Degree, Foundation Degree and Higher National Diploma courses that have software development as a major component. It may be offered to individuals or groups as appropriate.
A winning software solution will:
- Have clearly demonstrated its usability in its target environment
- Provide exceptionally clear online help
- Have demonstrated its stability
- Be systematically documented
- Show a clear and effective test schedule
- Provide a novel solution
- In the case of a group project, demonstrate excellent teamwork
Only one prize will be available per course per year.
The value of the prize will be:
- £100 (individual) or £40 per group member
- Free registration as an IAP member (value: £30) per individual or group member
- Free first year subscription as an IAP member per individual or group member
We want to make this scheme as widely available as possible but managing the entire assessment process centrally would be prohibitively expensive. And that’s where you come in. For each participating university or college, there will be a Fellow of the Institution who will act as Local Assessor. For instance, the Local Assessor at Plymouth, the inaugural university referred to in ‘Members’ News’, is Alastair Revell, BSc (Hons) FIAP MBCS CITP of Revell Research Systems in Exeter. Incidentally, we’d like to thank Alastair for all the work he has done in helping us develop the scheme and in working with Plymouth thus far.
The operative word above is ‘local’. Since the assessor’s function is to liaise with the university department in choosing the project to be submitted to the IAP Council for ratification, it’s important that they don’t incur significant travelling time. If we match Fellows with universities carefully, we don’t anticipate that the task will be especially onerous and we will, of course, do whatever we can to support Fellows undertaking it. Given that, by definition, we have almost no experience of the scheme yet, it will be important to create a mechanism by which participating Fellows can easily share their knowledge – with us and each other – as it develops.
So if you’d like to get involved in improving industry-university links, contact me (eo@iap.org.uk) in the first instance, indicating which university you’d like to work with. If you already have a contact there, let me have their details so that I can send them information about the scheme. If not, that’s OK. We may well already have one.
If you were expecting part 2 of John Ellis’ futurological essay in this space, sorry but there wasn’t room for it this month. So it’s been held over until February’s edition.
[Interesting project or development? Let us know at eo@iap.org.uk!] -
VSJ – June 2007 – Members' News
Mike Ryan, the Director General, talks about changes to the IAP Council
June 1st sees the start of the Institution’s new administrative year 2007/2008. This is the date when five members of the Council stand down by rotation and are hopefully re-elected or replaced by even more enthusiastic new people. Three of our current members have agreed to stand for a further 3-year term. They are Selva Naidu, John Ellis and Siddique Khan.
After six years of service to the Council, Ray Butler has decided not to stand again this time owing to pressure of work at London South Bank University. However, he may return some time in the future, so there is hope that his valuable experience and cool judgement may not be lost to the Institution forever.
We are particularly sorry to be losing Steve Cumbers, one of the most active of our members since he joined the Council in 1998. Steve was responsible for establishing the very successful format of the IAP symposia at Trinity House, which he started in 1999. Steve was just a few days away from staging IAP2000 the following year, when he was catapulted into the post of IAP Vice President, following the sudden resignation of Alex Robertson. The Council were in no doubt that Steve was the man for the job and reinforced their message by creating him a Companion, in recognition of his services to the Institution.
Steve’s background is in the sciences and he has always had a particular interest in mathematics. For more than 20 years he worked in Investment Banking and Treasury Operations. He has taught courses on financial derivatives and informatics and was Chairman of the Association of Independent Computer Specialists. Now, however, in a stunning redirection of his career, he is retraining for the medical profession. He expects to complete his medical degree at UCL – “a very demanding course”, he says – by 2011. We certainly wish him well and perhaps when he is qualified he will consider providing a bespoke medical service for IAP members!
Unfortunately, the departure of Steve and Ray has left us with two vacancies still to be filled. When we called for nominations earlier this year nobody had stepped forward by the closing date. IAP members seem to be very busy people – I wonder why! So as on some previous occasions it will be up to this year’s Council to co-opt people if and when they see fit.
[Don’t forget to email eo@iap.org.uk with items of news about you or your company.] -
VSJ – July 2006 – Sounding Board
Council member John Ellis, FIAP wrestles with his email and asks for your suggestions.
Backing up my PC recently, I ran out of space on the backup device. Usually I would have purchased a bigger hard drive and offloaded the data to the new box. Having some time on my hands, though, I thought I would look at what was eating all the space. One of the largest chunks was my email folders. Not surprising, with over 10 years’ worth of emails in them.
I’ve always maintained my folders under broad headings and then sub-divided them as appropriate. This seems a sensible set-up and has worked for many years (hence the glut of data). I know I could archive old data or my dead clients but I always think, “well it might be useful, so I’ll keep it online”.
Three points then occurred to me:
- Surely keeping even an email beyond a client’s life contravenes the Data Protection Act, so unless there is a legal reason why I should keep it, once the client has gone so should their emails.
- I’m sent content for clients’ Web sites, which is applied straight away, so why keep the source data beyond a few weeks? Delete the emails as soon as the work is completed. Files like images will be stored on the Web site anyway, so there’s automatically a backup.
- I keep snippets of code, links to Web sites with articles of interest, and so on. This sort of data has a limited life. So I probably should create folders that either have a ‘safe deletion’ date or else name the folders in a way that allows me to quickly identify what I can delete.
I’ve been going through my email folders gradually removing dead items, but it is time-consuming, so I’m now setting my folders up with a bit more care and filing the emails as they come in. The email folder is already down 23% and still falling. I wonder if others have better ways of managing their email stores?
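The third idea above, folders carrying a ‘safe deletion’ date, could be checked automatically. This Python fragment is purely illustrative; the bracketed naming convention is an assumption, not something from the article:

```python
import re
from datetime import date

# Assumed naming convention (hypothetical): a folder meant to expire is
# named like "Snippets [delete after 2006-12-31]". Anything whose date
# has passed can be deleted without a second thought.
EXPIRY = re.compile(r"\[delete after (\d{4})-(\d{2})-(\d{2})\]")

def deletable_folders(folder_names, today):
    """Return the folders whose safe-deletion date has passed."""
    expired = []
    for name in folder_names:
        m = EXPIRY.search(name)
        if m and date(*map(int, m.groups())) < today:
            expired.append(name)
    return expired
```

Run against a listing of folder names, this turns the periodic clear-out into a quick review of a short candidate list rather than a trawl through everything.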
You can contact John at john.ellis@wellis-technology.co.uk
[Something you’d like to get off your chest? Email me (Robin Jones) at eo@iap.org.uk.]