12.21.2004
Personal Disaster Recovery for the Masses
At least once a month, I back up all my important data and documentation, burn it to some CDs and place it in one of those lockable fire-proof boxes. I include my bookmarks and any software that I’ve downloaded and installed recently, like some of the open source projects I’m trying out. If I’m working on a specific project from home, I copy the data to multiple machines daily and usually burn it to CD every couple of days.
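For anyone wanting to automate a routine like this, here's a minimal sketch in Python of the archiving step (the directory names are placeholders; a real setup would add error handling and verify the archive before burning it to CD):

```python
import os
import zipfile
from datetime import date

def backup_dirs(dirs, dest_dir):
    """Zip each important directory into one date-stamped archive."""
    os.makedirs(dest_dir, exist_ok=True)
    archive = os.path.join(dest_dir, "backup-%s.zip" % date.today().isoformat())
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for d in dirs:
            for root, _, files in os.walk(d):
                for name in files:
                    path = os.path.join(root, name)
                    # store paths relative to the backed-up dir's parent
                    # (assumes dirs are passed without trailing slashes)
                    zf.write(path, os.path.relpath(path, os.path.dirname(d)))
    return archive

# e.g. backup_dirs(["/home/me/docs", "/home/me/bookmarks"], "/home/me/backups")
```

The date stamp in the filename makes it easy to keep several generations in the fire-proof box and rotate out the oldest.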
I also put my address book in the fire-proof box in various file formats as well as actual print-outs. This includes all my business contacts, credit card and utility companies, insurance companies, etc. If a fire ever burns my place down, I just need to find that box in the rubble, open it up and start making calls. I think most people could benefit from this regardless of whether or not they’re in IT.
My software documentation usually includes step-by-step instructions for getting back up to speed should disaster strike. I try to document everything as I use it the first time so that documentation isn’t a separate chore. I still have a lot of work to do here but I am definitely making progress.
Well, this weekend, my Fedora linux box up and died. The power supply keeled over (at the ripe old age of 10 years), and when the power failed it apparently corrupted the hard drives, because I couldn't boot back into linux afterward. So, I had to replace the power supply and then just re-install linux. I had documentation on all the options to choose during install. I also had docs on installing and configuring Apache, MySQL, PHP, and Perl with its modules.
Now, I am far from perfect and there were a few instances where I had to make educated guesses on setups but I was sure to add those into the docs as I went. There’s a kind of recursive nature to disaster recovery. The more disasters you have, the better you get at it I suppose.
The whole install and configuring process took me only about 5 hours. This could have been faster but my linux box is a Frankenstein, with most parts being pretty old and slow. I was pretty impressed that the whole process only killed one morning.
12.20.2004
There is a vulnerability with LiveUpdate. Please use LiveUpdate to download a fix.
I hate security updates like this.
From: http://www.securityfocus.com/bid/11873/discussion/
Symantec Windows LiveUpdate is reported prone to a local privilege escalation vulnerability. This issue can allow a local unprivileged attacker to gain administrative privileges on a vulnerable computer. It is reported that this issue only presents itself during an interactive LiveUpdate session. A local attacker may influence the LiveUpdate GUI Internet options configuration functionality in a manner that grants them elevated privileges. This issue affects Windows LiveUpdate on computers running retail versions of Symantec products and Symantec AntiVirus for Handhelds Corporate Edition v3.0.
And the solution: http://www.securityfocus.com/bid/11873/solution/
Symantec has released Windows LiveUpdate 2.5 to address this issue. This version can be automatically installed on vulnerable systems by running LiveUpdate. It is also available for download from the following location: http://www.symantec.com/techsupp/files/lu/lu.html
12.17.2004
.NET MVC framework for WinForms and ASP.NET?
My present needs have me creating a lightweight MVC framework for .NET. I need a system where I can create the core functionality once and deploy it as either a desktop WinForms application or an ASP.NET web application. It seems to me that loads of people would want this same functionality, and yet I haven't been able to find an MVC framework that can do this except for the UIP Application Block. Not only is that block created and supported by MS (strike one), but it is needlessly complicated (strike two). The user community surrounding it also seems to be non-existent (strike three).
There are a few MVC frameworks for ASP solutions on sourceforge and/or freshmeat but nothing with the ability to go from Winforms to ASP and back again. So, when I’m done with my own framework, I’ll probably open it up and slap it on sourceforge as a new project. Of course if I’m missing something in my search here, please drop me a line.
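The core idea I'm after can be sketched in a few lines (shown here in Python for brevity rather than C#, and with made-up class names): the controller talks only to an abstract view interface, so a WinForms view and an ASP.NET view become interchangeable.

```python
from abc import ABC, abstractmethod

class View(ABC):
    """Abstract view: the controller never knows whether it's desktop or web."""
    @abstractmethod
    def show(self, model):
        ...

class DesktopView(View):
    def show(self, model):
        return "[window] " + model["title"]

class WebView(View):
    def show(self, model):
        return "<h1>" + model["title"] + "</h1>"

class Controller:
    """Core functionality written once; the concrete view is injected."""
    def __init__(self, view):
        self.view = view

    def render(self, title):
        model = {"title": title}  # stand-in for real business logic
        return self.view.show(model)
```

The same `Controller` drives both front ends: `Controller(WebView()).render("Orders")` returns `<h1>Orders</h1>`, while `Controller(DesktopView()).render("Orders")` returns `[window] Orders`. That dependency injection at the seam is the whole trick.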
12.13.2004
Credit Card fraud prevention and algorithm design at Amex
Nothing is a bigger pain in the butt than pumping gas into your car in the middle of a snowstorm. Nothing, that is… except for when the gas pump turns down your Amex credit card and makes you scramble across icy pavement to pay the attendant inside. Luckily, my Amex card only started failing on my return trip as I visited family this weekend.
When I got home, I had several half-human, half-automated phone messages from Amex mentioning that my account was flagged for possible fraudulent charges.
The human read my name aloud and an automated voice gave the actual warning and 800 number to call. It was funny to me at the time that the men and women speaking my name completely butchered it. They had one job to do and they did it poorly. The automated voice performed admirably.
Upon phoning Amex, I went over my recent charges with a representative. I’m pretty anal about receipts and so I had almost every recent receipt in my wallet. All the charges he read off looked good to me so he removed the flag from my account but not before I questioned him on why the account was flagged in the first place.
I originally thought perhaps my card had been flagged for the recent onslaught of purchases I’ve put on it. I’ve fallen in love with the Amex point system and have been using my Amex card instead of cash for my holiday purchases. I like giving gifts so needless to say, that credit card is practically smoking in my wallet these days.
But it turns out that Amex flagged my account for a combination of reasons:
- There were several small purchases in one state on Friday morning and several more in another state that afternoon.
- There were 4 purchases at gas stations within a 24-hour period.
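Guessing at the business rules, the two flags above could look something like the sketch below. This is purely my speculation; the rep couldn't say whether either rule alone would trip the flag, so this just reports both, and Amex's real system is obviously far more sophisticated.

```python
from datetime import timedelta

def check_flags(transactions):
    """transactions: list of (datetime, state, merchant_type) tuples.
    Returns which of the two rules the rep described would fire."""
    # Rule 1: purchases in more than one state on the same calendar day
    states_by_day = {}
    for when, state, _ in transactions:
        states_by_day.setdefault(when.date(), set()).add(state)
    multi_state_day = any(len(s) > 1 for s in states_by_day.values())

    # Rule 2: 4+ gas-station purchases inside any 24-hour window
    gas_times = sorted(t for t, _, kind in transactions if kind == "gas")
    four_gas = any(gas_times[i + 3] - gas_times[i] <= timedelta(hours=24)
                   for i in range(len(gas_times) - 3))
    return {"multi_state_day": multi_state_day, "four_gas_in_24h": four_gas}
```

Sorting the gas purchases and sliding a 4-wide window over them is enough for rule 2, since any 4 purchases within 24 hours must appear consecutively in time order.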
Basically, since I was traveling by car through a few states and buying Christmas presents along the way, Amex thought perhaps my card was stolen. That's a reasonable assumption. What was in actuality just someone driving to a relative's house, buying gas and a few small gifts at outlet stores along the way could well have been a rowdy gang of credit card bandits souping up their post-apocalyptic Mad-Max vehicle with new electronic gadgets.
The Amex rep couldn’t tell me whether or not either of these reasons alone would have caused the flag to be set on my account. But I’m kind of glad that they have some kind of system in place that is at least looking for irregular patterns of purchasing. It got me to thinking about what kind of business rules their “purchase pattern algorithm” would require.
Right off the bat, they would need some kind of purchasing history on you. I know for a fact that if you only make small purchases on your Amex and then suddenly make a huge purchase, they'll flag you for that. But does that mean there's no fraud prevention at all during a new customer's first few months? Probably not, but I wonder.
Amex would also need to keep track of the geographic locations of purchases. For instance, if a purchase is made in this state now and in a far-away state 5 minutes from now, that's probably fraud. But how do they get and maintain that geographic information? What about web purchases? Is that order recorded with the geographic location of the purchaser or the vendor?
Amex also has to have date information built into their system. They didn’t flag me for lots of purchases near the holidays, they flagged me for other reasons. This means their system expects credit card activity to ramp up in November and December. That’s a reasonable assumption. But to me, it also immediately says that November and December are the best months to commit credit card fraud.
I’m not building any kind of system right now that would require this kind of algorithm design but it’s interesting to be able to look at one in use that affects us on a daily basis. Or at least as often as we use our credit cards. Next time, we'll look at algorithm design for traffic lights and how they always seem to turn red when I approach.
12.10.2004
The BALCO steroid scandal, Enron, Martha Stewart
Baseball, like any professional sport in the U.S., is big business. Millions and millions of dollars change hands every year because of baseball. The fact that some players broke the law to gain a competitive edge doesn't surprise me at all. By breaking the law and taking steroids, they upped their statistics, which garnered them a bigger paycheck. Their bosses probably turned a blind eye to this too, because the better a player's statistics, the more people sit in the stands and buy apparel, which in turn raises everyone's salary in the organization.
It seems pretty cut and dried to me. They broke the law to gain a competitive advantage. That's what folks at Enron did. That's what Martha Stewart did. That's what numerous other company personnel are doing right now in many other companies; they just haven't been caught yet.
Until big business learns some morality, I fear we will have to weather quite a few of these storms. I wonder these days if capitalism and morality can even coexist at all.
12.07.2004
Studies: Lost sleep equals gained weight
I've always said that at a certain point, sleep and food are interchangeable. I learned this back in my first run through college during the many sleepless nights I spent studying. I also find the same thing happens when I'm up late programming. If I'm missing a few hours of sleep, I try to replace them with a full meal. Now, there's a study that proves me right.
Losing sleep can raise levels of hormones linked with appetite and eating behavior, the researchers said... we are finding that people tend to replace reduced sleep with added calories.
So, keep that in mind the next time you're burning the midnight oil.
12.05.2004
It's a bunch of BCS
I like football. Not so much college football, though, because of the constant bickering involved in "who is the best" type discussions. Still, I wish I had paid more attention in my statistics classes in school, because I'm trying to understand this whole college football snafu of determining who is actually the number one team.
I’ve come to the conclusion based on several years of programming experience that there is no mathematical algorithm that can define who is the national champion in college football.
The BCS is quite simply broken beyond repair. It is therefore quite an interesting study in corporate application development.
Looking at the BCS standings from a mathematical standpoint has me just stumped. I tried reading through Colley’s Matrix info but my eyes glazed over. Colley's rankings make up essentially 1/7 of the BCS calculation. In retrospect, I probably started with the wrong part of the BCS calculation, as there seem to be a few simpler methods out there being used.
But Colley makes a good point before he gets to all the mathematical jargon. There are 117 division I-A college football teams, each playing about 11 games. Some of them play more than 11, which upsets the rankings. Some of these games may even be against division I-AA teams, which also upsets the rankings. Therefore, a team's opponents in a season can't be a statistically valid sample of the competing teams, since teams compete at multiple skill levels (I-A, I-AA) and play different numbers of games. Put another way:
There are too many teams in college football, there's too much disparity between them, and they don't all play the same number of games.
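For the curious, the part of Colley's method that made my eyes glaze over boils down to solving one linear system, C r = b: the matrix has 2 plus each team's game count on the diagonal and minus the number of head-to-head meetings off the diagonal, and each right-hand-side entry is 1 + (wins - losses)/2. A toy sketch in pure Python (the three-team example is mine, not Colley's):

```python
def colley_ratings(teams, games):
    """games: list of (winner, loser) pairs. Solves C r = b."""
    idx = {t: i for i, t in enumerate(teams)}
    n = len(teams)
    C = [[0.0] * n for _ in range(n)]
    b = [1.0] * n
    for i in range(n):
        C[i][i] = 2.0
    for w, l in games:
        wi, li = idx[w], idx[l]
        C[wi][wi] += 1; C[li][li] += 1   # total games on the diagonal
        C[wi][li] -= 1; C[li][wi] -= 1   # head-to-head off the diagonal
        b[wi] += 0.5; b[li] -= 0.5       # b_i = 1 + (wins - losses) / 2

    # Gaussian elimination with partial pivoting (C is small and well-behaved)
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(C[r][col]))
        C[col], C[piv] = C[piv], C[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = C[r][col] / C[col][col]
            for c in range(col, n):
                C[r][c] -= f * C[col][c]
            b[r] -= f * b[col]
    r = [0.0] * n
    for i in range(n - 1, -1, -1):
        r[i] = (b[i] - sum(C[i][j] * r[j] for j in range(i + 1, n))) / C[i][i]
    return dict(zip(teams, r))

# A beats B, A beats C, B beats C -> ratings 0.7, 0.5, 0.3 (average 0.5)
ratings = colley_ratings(["A", "B", "C"], [("A", "B"), ("A", "C"), ("B", "C")])
```

Notice the math itself is fine; the ratings always average 0.5 and the transitive winner comes out on top here. The trouble Colley points out is in the inputs, not the algebra.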
The solution? Well, a lot of people want to get rid of the BCS computerized rankings. Some people want to change them to add in margin of victory. Some people want to add an 8 team playoff like in I-AA. (The problem there is how to decide which 8 teams though, so the BCS will again be involved).
This becomes an interesting business problem because this is not a problem that can be solved by throwing more data at it. This is a classic (but wrong) business system approach:
“We don’t think this data is always right. So, we’re going to fix it by throwing more data into it.”
It won’t help if they add certain items like “margin of victory” into the calculations. Any programmer can almost immediately come to the underlying conclusion that there is failure built into the system. The architecture, if you will, for determining the college football national champion will almost never provide a definitive champion. Quite simply, the BCS system is broken and needs to be re-built from the ground up. Much like a malfunctioning application, sometimes you have to know when continual debugging isn’t helping, and be confident enough to design and build a new system.
The solution to all of this BCS nonsense is three-fold.
First, they need to get rid of half the teams in division I-A football. Half of those teams are terrible anyway. Army. Vanderbilt. Western Michigan. Good gravy, my mom and her sewing circle could beat those teams.
Second, every team in the country needs to be in a conference and that conference needs to have a playoff system to determine the winner. Yes, this means you Navy and Notre Dame (if you are ever good again).
Third, you can’t let division I-A teams play division I-AA teams. It screws everything up. The math goes out the window and it is like comparing apples to pineapples.
Or of course, we could just say “who cares” and let those boys just earn a decent education instead of spending all this time, money and effort on trying to weed out the winners from the losers.
But who the hell wants to do that? It’s so un-American.