Writing Blog

May 28, 2016

The Antikythera Laptop

Filed under: Computers,Gadgets,History,Internet,Science,Software,Technology — Rafael Minuesa @ 1:43 PM
Greek salesgirl showing the latest model of the Antikythera Laptop to Sosipatra (notice her smile anticipating the sale).

The Antikythera Laptop was an ancient computer powered by an analog mechanism: a box with dials on the outside and a very complex assembly of bronze gear wheels mounted inside.

A prototype for an Antikythera smart wristwatch that didn’t make it to the manufacturing line.

The computer didn’t do much apart from accurately computing the time it takes for planetary bodies to complete their orbits, but that was quite an unprecedented feat at the time, one that could not be replicated until the development of mechanical astronomical clocks in the fourteenth century. Besides, the Ancient Greeks favored Theater over Video any day of the week, so there was no need for a Graphics Card either; they were happy just watching the dials go round and round, which provided them with an infinite source of inspiration to come up with all kinds of theorems.

In its latest models the Antikythera Laptop featured just 2 USB-G ports, a decision that was highly criticized by users, who could not understand why they had to part with a substantial amount of extra Drachmas to buy an adapter just to get their machines connected to the other standard devices of the time.

The laptops were manufactured using very sturdy materials, which made them extremely hard and durable. They were also waterproof, as long as the machinery was not underwater for more than a year or so, and they came with a lifetime warranty that included accidental damage protection. Those were definitely different times, with a different code of ethics.

This one had its warranty voided after more than 2,000 years under sea water.

March 7, 2010

Flat-file based PHP CMSs

This is one of a series of articles I posted at my Web Development Blog.
You can view the original version at:

After several unsuccessful attempts to access the MySQL databases that provide the data for my main client’s web pages, I have decided to look into options that do not require a separate MySQL database that returns errors far too often, with a separate password that nobody knows who holds or how to retrieve, and a separate set of skills outside the realm of most web designers.

To my surprise, since the last time I checked on flat-file based PHP Content Management Systems, quite a few good options built around this approach have emerged, which I list below:

CMSimple is simple, small and fast. It writes page data directly to an HTML file on the web server. Configuration files and language files are also saved in .txt format.
CMSimple also offers a wide variety of plug-ins, including many made by third parties.

DokuWiki is wiki software that works on plain text files and thus needs no database. Its syntax is similar to the one used by MediaWiki and its data files remain readable outside the wiki.
It has a generic plugin interface which simplifies the development and maintenance of add-ons.
DokuWiki is included in the Debian and Gentoo Linux distributions.

Pluck allows for easy web page creation for users with little or no programming experience, and does not use a database to store its data.
Pluck also includes a flexible module system, which allows developers to integrate custom functionality into the system. Pluck includes 3 default modules: Albums, Blog and Contact form.

razorCMS is designed to be as small as possible (around 300 KB including a WYSIWYG editor), just enough to be useful on a base install. Extra functionality can then be added as needed via the Blade Pack management system.

Skeleton CMS isn’t really a CMS as much as a very simple framework for rapid prototyping. If nothing else, it’s a well-structured site model to start building a website from. There is no need for a database and no fancy admin area, but if you’re building a site for a client and you don’t need the power of WordPress, this might be exactly what you’re after.

As pointed out, the main advantage of using flat (text) files as a database for a PHP-driven CMS is that you no longer depend on external software to edit and maintain part of the CMS, without which the system simply will not run.

But you must keep in mind that although flat files are an acceptable solution for small databases, they become sluggish as the database grows, because access is sequential.
Another disadvantage is their inability to support transactions, and probably the biggest concern is security: a database protects the data from outside intrusion better than a flat file because it provides a security layer of its own.
Having said that, NOTHING that is hosted on a server connected to the Internet is secure, and if there are hackers equipped with enough resources who are intent on breaking into your system, they eventually will.
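To make the flat-file idea concrete, here is a minimal sketch of how such a CMS might store its pages, one text file per page, with no database server involved. The directory layout and function names are my own for illustration, not taken from any of the CMSs listed above:

<?php
// Minimal flat-file page storage: the file system is the "database".
define('DATA_DIR', __DIR__ . '/data');

if (!is_dir(DATA_DIR)) {
    mkdir(DATA_DIR);
}

function save_page($slug, $content) {
    // One text file per page; basename() keeps a slug from escaping the data dir.
    file_put_contents(DATA_DIR . '/' . basename($slug) . '.txt', $content, LOCK_EX);
}

function load_page($slug) {
    $file = DATA_DIR . '/' . basename($slug) . '.txt';
    // Reading means scanning the file sequentially, which is exactly why
    // flat files get sluggish as the data set grows.
    return is_file($file) ? file_get_contents($file) : false;
}

save_page('about', '<h1>About us</h1>');
echo load_page('about');

No MySQL server, no separate password, and the pages remain readable and editable with any text editor.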

An intermediate solution would be the use of SQLite, an open source embedded relational database management system contained in a small C programming library that, when linked in, becomes an integral part of the program. The entire database, including definitions, tables, indices, and the data itself, is stored as a single file on the host machine.
SQLite is embedded into a growing number of popular applications, such as Mozilla Firefox or Google’s Android OS.

Sounds good to me.
Below are some of the CMSs that can use SQLite to store their data:

eoCMS (everyone’s Content Management System) uses MySQL or SQLite and PHP to deliver content in a user-friendly interface.
It features a forum, moderation features, custom 404 pages, personal messaging, plug-ins, RSS output, ratings, etc.

Frog CMS is an extendable open source content management system designed to use PHP5 along with a MySQL database backend, although it also has support for SQLite. It is actually a port of Radiant, the Ruby on Rails CMS, although Frog has begun to take its own development direction.

FUDforum supports all the standard features you have come to expect from a modern forum, with a robust MySQL, PostgreSQL or SQLite back-end that allows for a virtually unlimited number of messages (up to 2^32).

Habari is a modular, object-oriented blogging platform that supports multiple users, multiple sites, plugins, importers for Serendipity and WordPress, etc.
Habari prides itself on being standards compliant and more secure than other blogging platforms by making use of PDO and enabling prepared statements for all interactions with the database.

Jaws is a CMS and modular framework that focuses on User and Developer “Friendliness” by providing a simple and powerful framework to hack your own modules.

Lanius CMS comes with two flat-file database choices (Gladius DB and SQLite) that work out of the box with both PHP4 and PHP5.

phpSQLiteCMS is based on PHP and SQLite and runs “out of the box”. phpSQLiteCMS uses PDO as its database interface, which also makes it possible to run it with MySQL.

Serendipity is a PHP-based blog and web content management system that supports PostgreSQL, MySQL, and SQLite database backends and the Smarty template engine, and has a plugin architecture that is kept up to date by automatically checking the online plugin repository.

The only drawback to this approach in many of the systems above is that they use the PHP Data Objects (PDO) extension, a lightweight, consistent interface for accessing databases in PHP. Although PDO greatly reduces the system’s vulnerability to SQL injection attacks, it requires the new OO features in the core of PHP 5, and so will not run with earlier versions of PHP.
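For those on PHP 5 the payoff is easy to see. Here is a minimal sketch of the PDO-plus-SQLite combination with a prepared statement; the database file, table and column names are invented for this example:

<?php
// Open (or create) an SQLite database file through PDO.
$db = new PDO('sqlite:' . __DIR__ . '/site.db');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE IF NOT EXISTS pages (slug TEXT PRIMARY KEY, content TEXT)');

// A prepared statement keeps user input out of the SQL string itself,
// which is what blunts SQL injection attacks.
$slug = isset($_GET['page']) ? $_GET['page'] : 'home';
$stmt = $db->prepare('SELECT content FROM pages WHERE slug = :slug');
$stmt->execute(array(':slug' => $slug));
echo $stmt->fetchColumn();

Because PDO abstracts the driver, swapping the connection string for something like 'mysql:host=localhost;dbname=site' is, give or take driver quirks, all it takes to move the same code to MySQL.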

February 3, 2010

What is the Google Affiliate Network?

Filed under: Blogs,Internet,Software — Rafael Minuesa @ 1:46 AM
This is one of a series of articles I posted for the 1000 Webs’ Blog.
You can view the original version at:

When Google acquired DoubleClick in March 2008, it also acquired its affiliate ad network program called Performics, the first full-service affiliate network, founded in 1998, which had in turn been acquired by DoubleClick in 2004. Google has now further developed and rebranded Performics as the Google Affiliate Network.

The Google Affiliate Network works like any ordinary affiliate ad network, by enabling advertising relationships between publishers and advertisers, whereby publishers get paid for every successful sale transaction that their site brings to advertisers.

As a Google Affiliate Network publisher, you can add an advertiser’s banner or text link on your site. When a transaction, such as a sign-up or purchase, occurs through one of these affiliate links, Google Affiliate Network will track the sale and pay you a commission or bounty.

Someone clicks the ad on your site…
…buys the advertised product…
…and you receive a commission on the sale.

The Google Affiliate Network has been integrated into Google AdSense. All Google Affiliate Network publishers must accept AdSense terms. Additionally, all earnings are distributed through AdSense.
But being a Google AdSense publisher does not automatically make you a publisher in Google Affiliate Network. You must complete a separate application for Google Affiliate Network.
In order to join the program, you need to apply to their network in two steps:
Step 1: Link to or apply for a Google AdSense account.
Step 2: Tell Google about your site and promotional methods.

Each application is reviewed by the Google Affiliate Network quality team, which checks requirements such as having a site that attracts a desirable audience for the products offered, being able to test advertising offers and nurture the most productive relationships, being an expert in driving and converting visitor traffic, and adhering strictly to Google Affiliate Network quality standards and advertiser policies.
In addition, Google states that:

we’ve found that Google Affiliate Network tends to yield greater benefits to publishers who create niche content, manage loyalty and rewards programs, aggregate coupons and promotions, or manage social media.

Payments are processed on a cost-per-action (CPA) basis, typically as a revenue share or fixed bounty for a lead or other action. Google Affiliate Network earnings will be posted to your Google AdSense account approximately 30 days after the end of every month.


More Info:

October 13, 2008

The Integrator

Filed under: Computers,Software — Rafael Minuesa @ 2:31 PM

Integration of Visual Studio and Expression Blend through XAML

Extensible Application Markup Language (XAML)

XAML or Extensible Application Markup Language (pronounced zammel [ˈzæmɫ̩]) is a declarative XML-based language created by Microsoft which is used to initialize structured values and objects.

XAML is used extensively in .NET Framework 3.0 technologies, particularly Windows Presentation Foundation (WPF) and Windows Workflow Foundation (WF).
In WPF, XAML is used as a user interface markup language to define UI elements, data binding, eventing, and other features. In WF, workflows can be defined using XAML.

XAML elements map directly to Common Language Runtime object instances, while XAML attributes map to Common Language Runtime properties and events on those objects.
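A tiny fragment makes that mapping concrete. This is an illustrative sketch rather than code from any particular project; Button, Content and Click are the standard WPF names, while the handler name is invented:

<!-- This <Button> element instantiates a System.Windows.Controls.Button.
     The Content attribute sets that object's Content property, and the Click
     attribute wires the CLR Click event to a handler in the code-behind. -->
<Button Content="Place order" Click="OrderButton_Click" />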

XAML files can be created and edited with visual design tools such as Microsoft Expression Blend, Microsoft Visual Studio, and the hostable Windows Workflow Foundation visual designer.
They can also be created and edited with a standard text editor, a code editor such as XAMLPad, or a graphical editor such as Vector Architect.

XAML represents a bridge between the designer and developer teams.
A new role has emerged as the result of this fusion, that Paul Alexander, a technical program manager with IdentityMine, calls the integrator:

“The Integrator understands the needs of the developer while also supporting the needs of the designer to assure that the app’s UI is as compelling as it was designed, while also validating that the concepts can be realized in code from the developer.”

The integrator deals mostly with XAML code and provides an interface between the developer and designer, by structuring and modularizing the XAML.

Therefore, the ideal integrator must possess strong design skills and a thorough understanding of XAML and Windows Presentation Foundation (WPF) concepts such as inheritance, styles, and resource lookup.

The Designer<->Integrator<->Developer model allows the design team to leave the XAML to the integrator and focus on having their assets effectively integrated into the project.
Designers can work with tools such as Expression Design, Inkscape or Adobe Illustrator and output the results as XAML.
The integrator then integrates the XAML into the project and passes it on to the developers, who need not be concerned with design issues.
Obviously, this model also works perfectly well the other way around, and in some cases it is advisable to have the developer team establish the foundations of the project.
Expression Blend, by Microsoft, makes this transition even easier, by accepting and generating XAML code that can be directly imported/exported from/to Visual Studio.

More Info:

March 3, 2008

SPAM on Usenet

This is one of a series of articles I posted for magiKomputer.
You can view the original version at:
* http://magikomputer.blogspot.com/2008/03/spam-on-usenet.html

From Wikipedia, the free encyclopedia:

“Spamming is the abuse of electronic messaging systems to indiscriminately send unsolicited bulk messages. While the most widely recognized form of spam is e-mail spam, the term is applied to similar abuses in other media: instant messaging spam, Usenet newsgroup spam, Web search engine spam, spam in blogs, wiki spam, mobile phone messaging spam, Internet forum spam and junk fax transmissions.

Spamming is economically viable because advertisers have no operating costs beyond the management of their mailing lists, and it is difficult to hold senders accountable for their mass mailings. Because the barrier to entry is so low, spammers are numerous, and the volume of unsolicited mail has become very high. The costs, such as lost productivity and fraud, are borne by the public and by Internet service providers, which have been forced to add extra capacity to cope with the deluge. Spamming is widely reviled, and has been the subject of legislation in many jurisdictions.”

Spam affects just about everybody who uses the Internet in one form or another. And in spite of what Bill Gates forecast in 2004, when he said that “spam will soon be a thing of the past”, it is getting worse by the day. While the European Union’s Internal Market Commission estimated in 2001 that “junk e-mail” cost Internet users €10 billion per year worldwide, the California legislature found that spam cost United States organizations alone more than $13 billion in 2007, including lost productivity and the additional equipment, software, and manpower needed to combat the problem.

Where does all that spam come from? Experts from SophosLabs (a developer and vendor of security software and hardware) analyzed spam messages caught by companies involved in the Sophos global spam monitoring network and came up with a list of the top 12 countries that spread spam around the globe:

  • USA – 28.4%;
  • South Korea – 5.2%;
  • China (including Hong Kong) – 4.9%;
  • Russia – 4.4%;
  • Brazil – 3.7%;
  • France – 3.6%;
  • Germany – 3.4%;
  • Turkey – 3.%;
  • Poland – 2.7%;
  • Great Britain – 2.4%;
  • Romania – 2.3%;
  • Mexico – 1.9%;
  • Other countries – 33.9%

There are many types of electronic spam, including:

  • E-mail spam (unsolicited e-mail);
  • Mobile phone spam (unsolicited text messages);
  • Messaging spam (“SPIM”): use of instant messenger services for advertisement or even extortion;
  • Spam in blogs (“BLAM”): posting random comments or promoting commercial services to blogs, wikis and guestbooks;
  • Forum spam: posting advertisements or useless posts on a forum;
  • Spamdexing: manipulating a search engine to create the illusion of popularity for web pages;
  • Newsgroup spam: advertisement and forgery on newsgroups; etc.

For the purpose of this post we shall focus on Newsgroup spam, the type of spam where the targets are Usenet newsgroups.
Usenet convention defines spamming as excessive multiple posting, that is, the repeated posting of a message (or substantially similar messages). During the early 1990s there was substantial controversy among Usenet system administrators (news admins) over the use of cancel messages to control spam. A cancel message is a directive to news servers to delete a posting, causing it to be inaccessible to those who might read it.
Some regarded this as a bad precedent, leaning towards censorship, while others considered it a proper use of the available tools to control the growing spam problem.
A culture of neutrality towards content precluded defining spam on the basis of advertisement or commercial solicitations. The word “spam” was usually taken to mean excessive multiple posting (EMP), and other neologisms were coined for other abuses — such as “velveeta” (from the processed cheese product) for excessive cross-posting.
A subset of spam was deemed cancellable spam, for which it is considered justified to issue third-party cancel messages.

The Breidbart Index (BI), developed by Seth Breidbart, provides a measure of the severity of newsgroup spam by calculating the breadth of any multi-posting, cross-posting, or combination of the two. BI is defined as the sum of the square roots of the number of newsgroups each copy of an article was posted to. If that number approaches 20, the posts will probably be canceled by somebody.
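As a rough sketch of the arithmetic (the function name is mine; the threshold of 20 is the informal convention mentioned above):

<?php
// Breidbart Index: for each copy of an article, take the square root of the
// number of newsgroups that copy was posted to, then sum over all copies.
function breidbart_index(array $groupCounts) {
    $bi = 0.0;
    foreach ($groupCounts as $count) {
        $bi += sqrt($count);
    }
    return $bi;
}

// Example: a message posted 5 separate times, each copy cross-posted to
// 16 newsgroups: BI = 5 * sqrt(16) = 20, right at the cancellation threshold.
echo breidbart_index(array(16, 16, 16, 16, 16)); // prints 20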


The use of the BI and spam-detection software has led to Usenet being policed by anti-spam volunteers, who purge newsgroups of spam by sending cancels and filtering it out on the way into servers.

A related form of newsgroup spam is forum spam. It usually consists of links, with the dual goals of increasing search engine visibility in highly competitive areas such as sexual invigoration, weight loss, pharmaceuticals, gambling, pornography, real estate or loans, and generating more traffic for these commercial websites.
Spam posts may contain anything from a single link to dozens of links. Text content is minimal, usually innocuous and unrelated to the forum’s topic. Full banner advertisements have also been reported.
Alternatively, the spam links are posted in the user’s signature, where they are more likely to be approved by forum administrators and moderators.
Spam can also be described as posts that have no relevance to the thread’s topic, or no purpose at all (e.g., a user typing “CABBAGES!” or other such useless posts in an important news thread).

When Google bought the Usenet archives in 2001, it provided a web interface to text groups (thus turning them into some kind of web forums) through Google Groups, from which more than 800 million messages dating back to 1981 can be accessed.
There are some especially memorable articles and threads in these archives, such as Tim Berners-Lee’s announcement of what became the World Wide Web:
http://groups.google.com/groups?selm=6487%40cernvax.cern.ch
or Linus Torvalds’ post about his “pet project”:
http://groups.google.com/groups?selm=1991Oct5.054106.4647%40klaava.Helsinki.FI
You can view a pick of the most relevant posts here:
http://www.google.com/googlegroups/archive_announce_20.html

But Google Groups is responsible for the greater proportion of the spam that floods Usenet nowadays. Google Groups isn’t the only source, but it is the one that makes it easiest for spammers to carry out their irritating activities.
It’s so easy to spam Usenet through Google Groups that some infamous spammers have been doing so for years. Perhaps the best known of all is the MI-5 Persecution spammer, who makes his way across just about any newsgroup with rambling postings that often appear as clusters of 20 or more messages, all related to Mike Corley’s perceived persecution by MI5, the British intelligence agency. This UK-based spammer readily admits in several of his postings that he suffers from mental illness. He annoys the rest of the users in such an exasperating way that some of them have even offered themselves to MI-5 to personally finish off the job.

The solution, IMHO, is to implement the Breidbart Index in Google Groups. It would be an easy task for a company that excels at implementing all kinds of algorithms in its search engine, and I just can’t understand what they are waiting for.

http://www.youtube.com/watch?v=XZ6N5m8FpVg

More Info:
Newsgroup Spam

February 23, 2008

And the Winner is … Google

Filed under: Internet,Software — Rafael Minuesa @ 12:55 AM
This is one of a series of articles I posted for the 1001webs’ blog.
You can view the original version at:
* http://1001webs.blogspot.com/2008/02/and-winner-is-google.html

On Feb 1 Microsoft offered to buy Yahoo! for $31 per share, a deal that was valued at $44.6 billion, in an attempt to acquire assets that would allow MSN to become a real competitor to Google’s supremacy on the Internet.

Microsoft justified its interest in acquiring Yahoo! by explaining that:

“The industry will be well served by having more than one strong player, offering more value and real choice to advertisers, publishers and consumers.”

Yahoo! would certainly add some very valuable assets to Microsoft’s Internet division, such as an audience of more than 500 million people per month on sites devoted to news, finance and sports, Yahoo! Mail (the most widely used consumer e-mail service on the Internet), and the web banner ads used by corporate brand advertisers.
Although the price was a 62% premium above the closing price of Yahoo! common stock of $19.18 on January 31, 2008, it was only about a quarter of what Yahoo! was worth in 2000, and the company’s board finally rejected the offer two weeks ago because they felt the company was being undervalued at $31 a share. Or at least that’s what they said.

At a conference at the Interactive Advertising Bureau on Monday, Yahoo! chief executive Jerry Yang had the chance to provide his own version of the story.
Yang broke the ice with a “Before you start, let me guess what your first question is. Does it start with an M and end with a T?”
However he did not elaborate much further:

“Everyone has read what we are doing, so there is not much to report. We’re taking the proposal that Microsoft has given to us seriously. It’s been a galvanizing event for everyone at Yahoo. Our board is spending a lot of time thinking about all the alternatives. It’s something that we need to think through carefully.”

But Microsoft is not to be put off so easily, and has recently hired a proxy firm to try to oust Yahoo!’s board.
Last Friday, Microsoft released an internal memo from Kevin Johnson, President of Microsoft’s Platforms & Services Division, where he actually sees the deal going through:

“While Yahoo! has issued a press release rejecting our proposal, we continue to believe we have a full and fair proposal on the table. We look forward to a constructive dialogue with Yahoo!’s Board, management, shareholders, and employees on the value of this combination and its strategic and financial merits.
If and when Yahoo! agrees to proceed with the proposed transaction, we will go through the process to receive regulatory approval, and expect that this transaction will close in the 2nd half of calendar year 2008. Until this proposal is accepted and receives regulatory approval, we must continue to operate our business as we do today and compete in this rapidly changing online services and advertising marketplace.
It is important to note that once Yahoo! and Microsoft agree on a transaction, we can begin the integration planning process in parallel with the regulatory review. We can create the integration plan but we cannot begin to implement it until we have formal regulatory approval and have closed the transaction. Because the integration process will be critical to our success as a combined company, we are taking this very seriously. “

On the other hand, Google is not standing idle, among other obvious reasons because it owes Microsoft one from when the latter interfered with Google’s purchase of DoubleClick last year.
Google’s chief legal officer, David Drummond, wrote in The Official Google Blog:

“Microsoft’s hostile bid for Yahoo! raises troubling questions. This is about more than simply a financial transaction, one company taking over another. It’s about preserving the underlying principles of the Internet: openness and innovation.
Could Microsoft now attempt to exert the same sort of inappropriate and illegal influence over the Internet that it did with the PC? While the Internet rewards competitive innovation, Microsoft has frequently sought to establish proprietary monopolies — and then leverage its dominance into new, adjacent markets.
Could the acquisition of Yahoo! allow Microsoft — despite its legacy of serious legal and regulatory offenses — to extend unfair practices from browsers and operating systems to the Internet? In addition, Microsoft plus Yahoo! equals an overwhelming share of instant messaging and web email accounts. And between them, the two companies operate the two most heavily trafficked portals on the Internet. Could a combination of the two take advantage of a PC software monopoly to unfairly limit the ability of consumers to freely access competitors’ email, IM, and web-based services? Policymakers around the world need to ask these questions — and consumers deserve satisfying answers.”

In any case, it remains unclear how the situation will develop and where it will lead. Google probably can’t stop the deal, but it can delay it considerably, and the delay will certainly act in Google’s interests.

“In the interim, we foresee disarray at Microsoft and Yahoo, We believe the deal has distracted the engineers and should benefit Google over the next 18 to 24 months, providing with a major opportunity to advance in branded advertising.”

as foreseen by analyst Marianne Wolk of Susquehanna Financial Group.
According to Wolk,

“If instead Microsoft is forced to acquire Yahoo via a proxy fight, it would mean a more protracted closing process, then the transaction will not close until early 2009, when it would begin the complex integration of Yahoo’s 14,300 employees, multiple advertising platforms, technology infrastructures, content sites, culture, etc.
Google may not face a more competitive Microsoft-Yahoo until 2010.”

By then, she said, Google could “extend its lead in search monetization” and grab a “major lead in emerging growth areas, such as video advertising, mobile and local advertising.”
Wolk also pointed out that Google would likely find it easier to hire top engineers from Microsoft and Yahoo “as they fear for their jobs in a consolidation.”

My personal bet is that even if the deal goes ahead and Microsoft pours in huge loads of money and resources, it won’t work.
And it won’t because Microsoft will try to apply the same tactics that it has applied to gain dominance over the PC market, i.e. trying to force every user to use their software.

The Internet is totally different. You can’t force people to use your stuff. You have to convince them to use it. And in order to do that you have to provide a superior product. Neither MSN nor Yahoo! comes even close to what Google delivers in terms of search results and applications designed for the web.

Much needs to be improved in both MSN and Yahoo! in order to be able to compete with Google.
In the case of Yahoo! it is a technical issue. I recently switched from Google to Yahoo!’s search engine just to see how accurate the results were, and I had to switch back because the difference from Google’s is abysmal, both in accuracy and in quality of results.
I kind of feel sorry for Yahoo!, because I’ve been a long-time user of their services and I can see it going down the gutter, no matter what the final result of the acquisition is. They have some top-quality services, such as Yahoo! Mail or Yahoo! Finance, and in many countries in Asia Yahoo! is a real competitor to Google, but they need to innovate so much that I doubt they will ever reverse the downward trend.
They are moving in the right direction now with Web 2.0, but I’m afraid that it might be too late.
They have recently announced that they are opening up their Search to third parties, so that everybody can collaborate in building their search results:

“This open search platform enables 3rd parties to build and present the next generation of search results. There are a number of layers and capabilities that we have built into the platform, but our intent is clear — present users with richer, more useful search results so that they can complete their tasks more efficiently and get from “to do” to “done.”

Because the platform is open it gives all Web site owners — big or small — an opportunity to present more useful information on the Yahoo! Search page as compared to what is presented on other search engines. Site owners will be able to provide all types of additional information about their site directly to Yahoo! Search. So instead of a simple title, abstract and URL, for the first time users will see rich results that incorporate the massive amount of data buried in websites — ratings and reviews, images, deep links, and all kinds of other useful data — directly on the Yahoo! Search results page.

We believe that combining a free, open platform with structured, semantic content from across the Web is a clear win for all parties involved — site owners, Yahoo! and most importantly, our users.”

Let’s wait and see.
You can see the details at the following links:

And MSN simply doesn’t get it. They’re trying to apply the same centralized tactics that made them so successful in the PC market, but it is evident that they won’t work on the Internet.

By combining both companies you’ll only get a much more cumbersome monster, and the Internet is about just the opposite: decentralization and agility.

Many people attribute the initial success of Google to the quality of its search results. That is true today, but it wasn’t so in the beginning, when they started to draw users from other search engines. The main reason most people made Google their home page is that it was simple. No advertising or fancy graphics, just a search box and a menu where the rest of the services are listed as text links, on a page that loads very fast.

By trying to push users into using your services and bloating your front page with advertising, you are actually driving them away.
To be fair Yahoo! does have a version of their home page that is designed that way:
http://search.yahoo.com/
Had they made it their front page many years ago, they’d still be in the game.

Another feature that convinced me to switch to Google many years ago was that they gave you the opportunity to try your search terms on different search engines with just one click, via the:
“Try your search on Yahoo, Ask, AllTheWeb, Live, Lycos, Technorati, Feedster, Wikipedia, Bloglines, Altavista, A9”
links that used to appear on every search results page.
The same functionality can be achieved nowadays by installing CustomizeGoogle, a highly recommended Firefox extension that enhances Google search results by adding extra information (like the above-mentioned links to Yahoo, Ask.com, MSN, etc.), with the added plus of enabling you to remove unwanted information (like ads and spam).

CustomizeGoogle 2 min introduction movie

By creating an extremely simple entrance to an environment open to everybody, including their most direct competitors, they have succeeded in being the most popular home page on the Internet.
The KISS approach (“Keep It Simple, Stupid”) is what they used.
Keep It Simple. And Open. Stupid.


  • Occam’s razor: “entities must not be multiplied beyond necessity”
  • Albert Einstein: “everything should be made as simple as possible, but no simpler”
  • Leonardo Da Vinci: “Simplicity is the ultimate sophistication”
  • Antoine de Saint Exupéry: “It seems that perfection is reached not when there is nothing left to add, but when there is nothing left to take away”

February 5, 2008

Keyloggers protection

This is one of a series of articles I posted for magiKomputer.
You can view the original version at:
* http://magikomputer.blogspot.com/2008/02/keyloggers-protection.html

Keylogging works by recording the keystrokes you type on the keyboard to a log file that can be transmitted to a third party. Keyloggers can capture user names, passwords, account numbers, social security numbers or any other confidential information that you type using your keyboard.

There are two types of Keystroke loggers:

  • Hardware key loggers are devices that are attached to the keyboard cable or installed inside the keyboard. There are commercially available products of this kind, even dedicated keyboards with key logging functionality.
  • Software key loggers are usually simple programs that capture the keystrokes the user is typing. They can also record mouse clicks, files opened and closed, sites visited on the Internet, etc. More advanced key loggers can also capture text from windows and take screenshots of what is displayed on the screen.

While writing a keylogging program is simple, it is a different matter to install it on the victim’s computer without getting caught and to download the logged data without being traced.

The best protection against keyloggers is to avoid them in the first place.
A few golden rules:

  • Use a Firewall
  • Use an Anti-virus program
  • Use an Anti-spyware program
  • Never click on links sent by unknown people, and be very careful with links from known ones, since their address might be faked. If in doubt, check the e-mail headers.
  • Never execute e-mail attachments that are executable files (EXE, COM, SCR, etc.). No exceptions here.
  • Never execute programs from the Internet that lack a security certificate. Except for Microsoft Update and very few others, there should be no reason to execute any programs from the web.
  • Run a virus and spyware check on ALL files that come from external sources (USB pen, DVDs, etc)

Additional measures that can be taken are:

  • Monitoring what programs are running on your computer.
  • Monitoring your network for applications that attempt to make a network connection.
  • Using an automatic form-filler program, which prevents keylogging since the keyboard is not used.

There are commercially available anti-keyloggers, but if you’re looking for a free alternative, try Spybot Search & Destroy, a freeware tool that does a pretty decent job of detecting all kinds of spyware:

Windows Defender, a free program that helps protect your computer against pop-ups, slow performance, and security threats caused by spyware: http://www.microsoft.com/athome/security/spyware/software/default.mspx

The Sysinternals web site hosts several utilities to help you manage, troubleshoot and diagnose Windows systems and applications.

  • File and Disk Utilities: utilities for viewing and monitoring file and disk access and usage.
  • Networking Utilities: networking tools that range from connection monitors to resource security analyzers.
  • Process Utilities: utilities for looking under the hood to see what processes are doing and the resources they are consuming.
  • Security Utilities: security configuration and management utilities, including rootkit and spyware hunting programs.
  • System Information: utilities for looking at system resource usage and configuration.
  • Miscellaneous Utilities: a collection of diverse utilities that includes a screen saver, a presentation aid, and a debugging tool.

In this article:
http://www.lazybit.com/index.php/a/2007/03/01/free_keylogger_protection
Alex provides some free and valuable advice about keylogging protection, such as using the on-screen keyboard available in Windows 2000 and XP, which can be launched by executing “osk”, or the technique of mouse highlighting and overwriting.

Or you can also download Click-N-Type virtual keyboard free from:
http://www.lakefolks.org/cnt/


Also worth reading is Wikipedia’s article on Keystroke logging:
http://en.wikipedia.org/wiki/Keystroke_logging

And a simple trick to fool keyloggers:
http://cups.cs.cmu.edu/soups/2006/posters/herley-poster_abstract.pdf

November 14, 2005

The Curse of the Amiga

This article was first posted by my alter-ego laparanoia at the magiKomputer Blog.
You can view the original version at:
* http://magikomputer.blogspot.com/2005/11/curse-of-amiga.html

Is the Amiga Dead Yet?
Not Yet.

Is it cursed?
No doubt.

Even I, as I was writing this post, had Firefox crash for the first time ever and lost about an hour’s work. Previously I had tried to post from Elicit and Zoundry, with similar results. In more than 3 years of blogging I had NEVER experienced anything even remotely similar. When I restarted, my right-button search function had vanished, and all those circumstances put together have made this the post that has taken by far the most effort to create. But, you see, I am an obstinate bastard, especially when it comes to something I’ve spent so many years working and playing with (or was it rather the other way around?) and that is so close to my heart as the Amiga.

I have been a fanatic user of the Amiga from 1991 until the turn of the millennium, and I still think it was the best machine mankind has ever created. What has happened to this computer is a real techno-tragedy and I am sure it has altered the course of History, and not for the Good.

I haven’t tried the latest hardware and software, but here is an excellent review by Jeremy Reimer, who bought an AmigaOne Micro with OS4 in November 2004:

The Micro-AmigaOne and Amiga OS4 Developer Prerelease
Jay Miner started the Amiga computer company in 1982, before Commodore bought it out.
The Amiga computer was first commercially released in 1985 by Commodore, which eventually went bankrupt in April 1994.
Commodore was bought at liquidation by Escom AG, who had no real interest in the Amiga. Escom itself went bankrupt a few years later, and the Amiga was briefly bought out by set-top manufacturer VISCorp, before they too filed for liquidation.
Its next owner was Gateway Computers, who were only interested in Commodore’s old patent portfolio. When it became increasingly clear that Gateway was never going to do anything with the Amiga, a consortium of investors calling themselves Amino Development bought the rights to the Amiga hardware and OS in 1999.
The new AmigaOne motherboards were first released in 2002, but there was no OS to go with them, so they shipped with Debian PPC Linux. After an agonizing 18-month wait, the first Developer Prerelease CD of OS4 was shipped to AmigaOne owners worldwide.

AmigaOne OS4
OS4 boots remarkably quickly. From a cold boot, including waiting for power up, BIOS messages, straight to a usable desktop took slightly over 30 seconds. A “warm boot,” which bypasses the BIOS start-up and merely reloads the operating system, takes slightly over 10 seconds.

One feature of the original custom Amiga graphics chips was that you could “pull” down screens with the mouse to see screens that were behind them. This feature, called “draggable screens,” was never duplicated by any graphics card manufacturer since, so sadly it is not available on the AmigaOne.

A cold boot, including power-up and BIOS messages, takes less time than you need to get settled in your chair. Compare that to any Windows/Mac OS start-up; they usually give me enough time to go and make coffee (Mac OS X is not that sluggish, to be honest).
I am sorry to hear that there are no “draggable screens”.
Another cool feature was the ability to click on several menu items at once (holding the right button and clicking with the left) and have the commands batch-processed in one go.

Many people, upon reading the hardware specs of the Micro Amiga One, will feel that the performance (800MHz PowerPC 750FX, SDR RAM) is far below modern gear. This is true to a certain extent, but it does not give the whole picture. AmigaOS was originally written for a 7.14 MHz 68000, and the last Classic version released by Commodore, 3.1, was optimized for a 12 MHz 68020 platform. According to Hyperion, over 90% of the OS code has been converted from 68k to PPC, and the only code yet to be translated (serial port code, AREXX macro routines), does not typically impact on performance.
Because the OS is so small (About 60MB on disk for a complete install), it fits very nicely in 256MB of RAM, with room for several applications, most of which have a similarly small memory footprint. This means that you can run the OS and multitask between several applications without ever swapping to the disk.

I have created and run multimedia presentations for TV stations on as little as an Amiga 500, 1Mb RAM, 720 Kb floppy, no Hard Drive. Gosh, I miss Scala so much…

In speeches around the world, Alan Redhouse of Eyetech always opens by saying that everyone always asks them: “Why are you doing this?” And the answer he gives, with a smile, is “We don’t know!” There is an infectious enthusiasm among Amiga users…

Infectious enthusiasm defines the feeling of Amiga users at that time.
As of today, if you visit Amiga’s Headquarters (http://www.amiga.com/) you’ll be presented with the latest technology in … Jackpots!!!?
It has broken my heart.

Better visit this one: http://www.amiga.org/

Is there a future for the Amiga?
Some people seem to think so:

http://www.amigaworld.net/
http://www.amitopia.no/

Jeremy Reimer has a website full of undiscovered gems at:
http://www.pegasus3d.com/jer_main.html
where, among other things, he promotes StudlyOS as the Only Operating System You Will Ever Need.
I wish I had the time to try it out.
I liked the Amigan comment on it, though:

“StudlyOS sucks!!!1111 Y00 think itz c00l but your rong!!!!!11111 I Cant run it on my Am1ga so what yoos is it????/ My Am1ga beats yor peecee anyday!!!!!! !!!11111111 Peecee even with StudlyOS cant beet Amiga because Amiga rules!!!! Amiga iz better because it is Amiga!!!1111 Nothing else is Amiga!!!11111” – B1FF


The Amiga Boing Ball is a mythical object in the computer industry. It was created as a demonstration of the machine’s ability: the demo showed a red and white ball bouncing around the screen and interacting with the environment, bouncing off the walls and spinning, all while the machine multitasked in the background.

That demo, displaying smooth animation in full color when other computers were only just managing color display, helped sell over a million Amigas at a time when a computer was a synonym for science fiction.
