Tuesday, September 23, 2008

Software Freedom Day: This weekend!

Software Freedom Day is coming up this weekend, on Saturday the 20th of September. SFD is an annual, international event, with around 500 teams in 90 countries organising local events. The biggest one in New Zealand will be in Wellington, but ... (more in the full post)
Read More

Currently at $2.70, these are a bargain!

Once all the idiots have sold out, the better off Telecom will be. Telecom is not the company it was four years ago. It is a lean, mean telco run by a very smart CEO, and it will continue to dominate the NZ landscape for many, many, many years to come.

Let's hope the investors bailing out at present are the idiots who only invested in the company for their dividend - greedy people who were only ever in it for a quick return, the same people who pushed for Telecom to make big profits and caused the problems that are now the stumbling block the company is having to deal with.

Those buying shares now are the smart ones who realise what a bargain they are getting...
Read More

The Big Bang is still holding on to its elusive secrets for a while longer, as it was announced recently that there have been more issues delaying the much-anticipated collisions in the collider.

Although it was hoped the problem could be resolved quickly, it turns out that, unfortunately, it won't be.

Whilst the anticipated time to fix the problem - at least two months - is tiny compared to how long it's taken to get this far, we've been waiting in anticipation for this project to complete for 20 years now, and two months still seems a very long time to all the people waiting for results.

So what was the issue at hand?
Well, the symptom was: they had to shut down the LHC when temperatures rose by roughly 100°C, causing around 1000kg of liquid helium to leak into the tunnel.

This sounds like a hell of a lot, but you need to remember that your average room temperature of 23°C is roughly 294°C hotter than the temperature they run the LHC at, so when the temperature rose by 100°C it was still pretty chilly in there (around -171.15°C).

The LHC runs near absolute zero: absolute zero is defined as 0 kelvin, and the LHC runs at around 2 kelvin (-273.15°C and -271.15°C respectively).
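
If you want to sanity-check that arithmetic, here's a quick sketch in Python using just the figures above (the numbers are from this post, not official CERN data):

ABS_ZERO_C = -273.15                 # 0 kelvin expressed in degrees Celsius

def kelvin_to_celsius(kelvin):
    return kelvin + ABS_ZERO_C

lhc_temp_c = kelvin_to_celsius(2)    # -271.15 degrees C: normal LHC operating temperature
room_temp_c = 23.0

print(room_temp_c - lhc_temp_c)      # 294.15: how much warmer your lounge is than the LHC
print(lhc_temp_c + 100)              # -171.15: still decidedly chilly after a 100 degree rise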

The actual problem has not yet been precisely pinned down, but they suspect it was caused by a faulty electrical connection between two magnets that stopped superconducting and then melted, which caused a mechanical failure that let the helium escape.

So we'll be waiting a while longer for the really intensive experimentation to begin. That said, if the doomsayers are right - and the chances I saw bandied around were the equivalent of winning some percentage of all the lottery competitions in the world on the same night - then we only get a couple more months on this Earth.

Although at least if you are in NZ, apparently, based on a number of calculations, if the highly unlikely event of a large enough singularity does occur, it'll take a wee while to grow big enough to destroy the planet, so we'll get plenty of warning!
Read More

Functionality may appeal to bailout-seeking financial services firms Read More

Over the past six years a lot has changed in the world of virus removal and computer security. Basically, with the advent of Microsoft's most secure operating system ever (Windows XP), the world of viruses / spyware / rootkits / exploits (collectively I'll refer to them as malware for this article) has exploded. Estimates of the number of new viruses released each week are normally in the thousands, if not tens of thousands. Add to that spyware, unwanted applications (e.g. WinAntivirus2008), trojans, adware ... and the list is enormous.

Recently I have found it is getting harder to ensure that the latest malware definitions are really catching the latest problems. In the last three weeks we have found several pieces of malware which had to be carefully removed by hand - with new definitions to detect them coming out two to four days after we had already discovered them.

Malware detection always lags behind the advent of new malware: a new virus / spyware / trojan etc normally has at least several hours', if not days', head start on the first definitions being released. For a new definition to be released, the malware has to be noticed, caught, reported and analysed, and finally a fix / detection signature released for it. Then the update has to be downloaded by the end user.

Part of the process we employ when doing a "Virus Bust" is to run several anti-spyware / malware removal tools and rootkit detectors across a system. This of course is quite time consuming, and again, if the malware is new, sometimes the only way it is detected is by seeing the effects of the malware still present (e.g. rubbish exiting the firewall, strange PC behaviour, pop-ups etc). Which started me thinking ...

Is it possible that the number of items of legitimate software on the average user's PC is growing at a slower rate than the number of malware instances? For example, the average user only wants to surf the net, send emails, write letters, do some word processing and listen to music or watch videos. Throw into that mix a bit of spreadsheeting, VOIP and games and you are still only looking at a fairly limited range of software.

In an average week the average user does not add much new software to a system; Microsoft updates and antivirus updates probably account for most of the changes to executable code. Instead of scanning for malware, maybe a better solution would be to keep a list of known good executable software and run a scan based on that. Any executable code found on a system that is not in the known-good DB can then be flagged as suspicious, and that subset of files scanned / isolated, instead of scanning an entire system of mostly good code for the odd piece of rot that has crept in.
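
To make the idea concrete, here's a rough sketch of such a whitelist scan in Python (the known-good hash, the file extensions and the scan path are all made up for illustration; a real system would ship a large, signed database from a trusted vendor):

import hashlib
from pathlib import Path

# Toy known-good database: SHA-256 hashes of certified executables.
# The single entry below is a placeholder, not a real certified file.
KNOWN_GOOD = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_of(path):
    # Hash the file in chunks so large executables don't exhaust memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def suspicious_executables(root):
    # Yield every executable whose hash is not in the known-good database.
    for path in Path(root).rglob("*"):
        if path.is_file() and path.suffix.lower() in (".exe", ".dll", ".sys"):
            if sha256_of(path) not in KNOWN_GOOD:
                yield path

for path in suspicious_executables("C:/suspect-system"):
    print("Unknown executable - scan or isolate:", path)

Only the files that fall out of that loop would need the expensive heuristic treatment.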

Security based not on positive detection of malware but on the isolation of unknown code offers a chance to detect potentially dangerous software on a PC more quickly. Certify the good code, isolate the unknown code, and then apply positive antivirus detection methods against the unknown executables.

Not only does this method have the possibility of being faster at scanning systems (creating and checking hashes is potentially faster than applying heuristic algorithms against an entire executable), but the ability to certify code as safe might alleviate some of the zero-hour threats we face nowadays. Certainly for someone like me, isolating the known good from the unknown means we can rapidly discard 99% of all files in a system as safe and concentrate on isolating the threats in the unknown one percent. It also offers a very positive way of providing reliable scanning from an alternative boot disk on compromised systems.

Historically, old antivirus systems (circa DOS and Windows 3.1) were able to add CRC codes or hashes to executable files and then check whether files matched a known hash. That method presents problems today and has fallen out of favour. However, as an offline virus scan, booted from an alternative operating system or boot disk and making use of a 'white list' database, it has the potential to add another tool to the security expert's arsenal.
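
For the curious, that old-school check boiled down to something like this (a minimal illustration in Python; the file name and stored checksum are invented):

import zlib

def crc32_of(path):
    # Compute the CRC32 of a file incrementally, chunk by chunk.
    crc = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            crc = zlib.crc32(chunk, crc)
    return crc & 0xFFFFFFFF

# Hypothetical checksum recorded back when the file was known to be clean.
STORED_CRC = 0x1C291CA3

if crc32_of("notepad.exe") != STORED_CRC:
    print("File has changed since it was certified - flag it for inspection.")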

Heaven knows we need it.

This has been a random thought from the fertile and over-caffeinated brain of Shane. Thoughts, feedback and offers of millions for the idea welcome.

Read More

The phone features many Google applications, and is integrated with the Amazon MP3 store Read More
