Don't be late for the Friday night drinks, as there's a bunch of free Epic Pale Ale. (Thank Mike Forbes for hooking that up, but you might want to get there before Mike, as he has a big thirst when it comes to Epic Pale Ale.)
http://www.kiwicon.org/
http://www.epicbeer.com
Twitter: @epicbeer
Ihug backs Comcom on 'bill and keep'; and well done Silverstripers
So after a relatively pain-free experience with an Orcon Homehub at work, I decided to get one for home.
My main motivation was wondering whether my DLink DSL-502T and a WRT54G modded to run DD-WRT were causing double-NAT issues, resulting in slow connections/torrents. You would imagine an all-in-one solution would at least halve the possible causes of slow speeds. And since upgrading, so far so good: more green lights than amber in Azureus/Vuze.
But as I've come to get used to the Homehub, I was hoping to get a feature of Leopard/MobileMe working - Back to My Mac.
The general use is to set up a zero-configuration VPN between your various Macs that aren't all in the same location. If this thing actually worked it would be awesome: Remote Desktop/VNC, file sharing, even Bonjour support, all with no complex setup. The problem is it only works in ideal configurations, and most of those are centred on American cable connections, not an ADSL setup in NZ.
Back to the Homehub: although it offers UPnP, it seems to lack true NAT-PMP support. Via this Apple Discussion thread, I've found what looks like the same issue, but on a BT Homehub (which is in no way the same box as Orcon's).
OK, so the problem is that the BT Voyager 2100 only supports the WANPPPConnection service of UPnP, while Apple currently only supports the WANIPConnection service. There's a chance Apple could add WANPPPConnection support in a future release, but until then you're out of luck.
When I used Lighthouse, a dynamic port-forwarding utility for OS X, to get some in-depth info, it reported:
Lighthouse could not associate with the router '192.168.1.1' because of invalid protocol implementations
The router '192.168.1.1' sent invalid responses to Lighthouse's requests to associate with it. This may happen if your router doesn't support NAT/PMP or UPnP (please check its manual) or because their implementation is incomplete. It may help to upgrade your router to its latest firmware version (information on how to do this should also be present in the router's manual).
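Whether a router advertises WANIPConnection or WANPPPConnection shows up in the device-description XML it serves over HTTP (the URL for that XML is advertised in the router's SSDP responses). Here's a minimal sketch, assuming you've already fetched that XML, that pulls out which WAN services the router offers; the function name and sample snippet are my own illustration, not anything from Lighthouse or Apple:

```python
import re

def find_wan_services(device_description_xml):
    """Return the WAN*Connection UPnP service types advertised in a
    router's device-description XML."""
    return re.findall(
        r"urn:schemas-upnp-org:service:(WAN(?:IP|PPP)Connection):\d+",
        device_description_xml)

# A router like the Homehub apparently only offers the PPP variant:
sample = "<serviceType>urn:schemas-upnp-org:service:WANPPPConnection:1</serviceType>"
print(find_wan_services(sample))  # ['WANPPPConnection']
```

If that list contains only WANPPPConnection, Back to My Mac's port-mapping request will fail exactly as described above.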
So my option would be to lose the Homehub, replacing it with a NAT-PMP-supporting ADSL2+ modem, ideally one that supports bridge mode. This turns it into a dumb modem that forwards everything through to your router's WAN port, including login and authentication, and hopefully NAT/UPnP. Then I can use my Airport Extreme to get proper NAT-PMP.
I did some reading and it would seem that the newest revision of the DLink DSL-502T does this - look out for Revision C.
Has anyone out there managed to get a bridged modem connection working into their router and if so what gear are you using?
Also, Orcon seem to be shipping a second revision of the Homehub. It's still based on a Siemens SX763, but the new model has a WPS button on the back to allow for easy wireless pairing (if your other gear supports it). The new Homehub also has an Orcon logo screened on the top, and there's no standard Siemens packaging, just an Orcon-branded slip around a white box.
Windows Server and SQL Server added to EC2
Xero to go head-to-head with MYOB
Over the past six years a lot of changes have happened in the world of virus removal and computer security. Basically, with the advent of Microsoft's most secure operating system ever (Windows XP), the world of viruses / spyware / rootkits / exploits (collectively I'll refer to them as malware for this article) has exploded. Estimates of the number of new viruses released each week are normally in the thousands, if not tens of thousands. Add to that spyware, unwanted applications (e.g. WinAntivirus2008), trojans, adware... and the list is enormous.
Recently I have found it is getting harder to ensure that the latest malware definitions are really catching the latest problems. In the last three weeks we have found several pieces of malware or viruses which have had to be carefully hand-removed, with new definitions to detect them coming out two to four days after we had already discovered them.
Malware detection always lags behind the advent of new malware: a new virus / spyware / trojan, when it is released, normally has at least several hours, if not days, of head start on the first definitions being released. In order for a new definition to be released, the malware has to be noticed, caught, reported, analysed, and finally a fix / detection signature released for it. Then the update has to be downloaded by the end user.
Part of the process we employ when doing a "Virus Bust" is to run several anti-spyware / malware removal tools and rootkit detectors across a system. This is of course quite time consuming, and again, if the malware is new, sometimes the only way it is detected is by seeing the results of the malware still present (e.g. rubbish exiting the firewall, strange PC behaviour, pop-ups etc.). Which started me thinking...
Is it possible that the number of items of legitimate software on the average user's PC is growing at a slower rate than the number of malware instances? For example, the average user only wants to surf the net, send emails, write letters, do some word processing and listen to music / watch videos. Throw into that mix a bit of spreadsheeting, VOIP and games and you are still only looking at a fairly limited range of software.
On an average week the average user does not add much new software to a system. Microsoft updates and antivirus updates probably account for most of the changes to executable code. Instead of scanning for malware, maybe a better solution would be to have a list of known-good executable software and run a scan based on that. Any executable code found on a system but not in the known-good DB can then be flagged as suspicious, and that subset of files scanned / isolated, instead of scanning an entire system of mostly good code for the odd piece of rot that has crept in.
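The core of the idea is simple enough to sketch: hash every executable on disk and flag anything not in the known-good database. A minimal illustration (the function names, file extensions and whitelist structure are my own assumptions, not any existing product's design):

```python
import hashlib
import os

def sha256_of(path):
    """Compute the SHA-256 digest of a file, reading it in chunks
    so large executables don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_for_unknown(root, known_good_hashes):
    """Yield executable-looking files whose hash is NOT in the
    known-good whitelist; only these need further analysis."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name.lower().endswith((".exe", ".dll", ".sys")):
                path = os.path.join(dirpath, name)
                if sha256_of(path) not in known_good_hashes:
                    yield path
```

In practice the whitelist would be a large vendor-maintained database rather than an in-memory set, but the scan logic stays the same: a hash lookup per file, rather than heuristic analysis of every file.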
Security based not on positive detection of malware but the isolation of unknown code offers a chance to allow quicker detection of potentially dangerous software on a PC. Certify the good code, isolate the unknown code and then apply positive antivirus detection methods against the unknown executables.
Not only does this method have the possibility of being faster at scanning systems (creating and checking hashes is potentially faster than applying heuristic algorithms against an entire executable), but the ability to certify code as safe might alleviate some of the zero-hour threats we face nowadays. Certainly for someone like me, isolating the known good from the unknown means we can rapidly discard 99% of all files in a system as safe and concentrate on isolating the threats in the unknown one percent. It also offers a very positive way of providing reliable scanning from an alternative boot disk on compromised systems.
Historically, old antivirus systems (circa DOS and Windows 3.1) were able to add CRC codes or hashes to executable files and then check whether files matched a known hash. That method presents problems today and has fallen out of favour. However, as an off-line virus scan, booted from an alternative operating system or boot disk and making use of a 'white list' database, it has the potential to add another tool to the security expert's arsenal.
Heaven knows we need it.
This has been a random thought from the fertile and over-caffeinated brain of Shane. Thoughts, feedback and offers of millions for the idea welcome.
Wellington Tweetup starts tonight at 5pm, at the Malthouse http://www.themalthouse.co.nz
I have shouted a few free beers - Epic Pale Ale
More Info
If you go, send me a tweet or upload a pic to Flickr so I can see what I have missed out on.
Cheers
Luke
http://www.epicbeer.com