Thursday, October 29, 2009

Man I gotta lay off the [SAS] dynamic room clearing. It's like bam, bam, bam, bam bam bam - 2 story building with 30-40 contacts clear in the blink of an eye.

I feel like doing back flips, lol

Friday, October 23, 2009

So, where have I been lately?

I haven't been keeping up with my journal lately, things have been hopping quite a lot recently.

Wednesday night, my mother had to be taken to the hospital. I spent the night on the couch, trying to keep the dogs sane; Willow & Coco have been bark happy! Thursday was basically spent on a mixture of travel, hospital, and chores. I got to spend some time with my brother after we visited our mother; I was there about three times during the day. Another night on the couch lol, and Coco doesn't exactly like to *share* that much!!! For years my mother thought that she might be diabetic, so finally being in a doctor's care for, I guess, the first time in 21 years, ma was diagnosed with type 2 diabetes. Friday saw the conclusion of testing, and it appears the main problem is diverticulitis. Who would've thought having a high(er) fiber diet in your 30s and 40s was going to be important twenty to thirty years later? So far, things seem to be pretty good. Ma should be coming home tomorrow, eh today, or at the latest Sunday.


For what hours I haven't been in a hospital room, I've basically been home with the dogs. I don't mind being alone as much as I would've thought, but I am really glad to hear that ma is coming home soon. It's too quiet without her, and the dogs are not used to just having me around the house.


Two things that I have learned: if I ever had to live alone, two fundamentally important things are a good radio (that gets 94.9) and gloves ^_^. Actually, I think this was the first time in my life that I really had to wash dishes; I'm not about to have her come home to a kitchen full of crap. For some reason, without my mother's constant interruptions the daytime has passed much slower than I'm accustomed to; so rather than quickly flying by days and slow nights, things are at a more natural pace. It will be nice to be able to sleep in my own bed again as well, rather than taking the nights on the couch. While Willow will hog my bed at night, Coco doesn't care much and won't often venture that far away from the living room, unless she wants to hide in my closet lol (oft' during thunderstorms).


Getting a little drowsy now, and I should be getting to sleep; only GOD knows when the phone is going to start ringing in the morning. Heh, and actually an interesting thing comes to mind now that I write this: I've probably spent as much time on the phone in the past three days as I have in the past three years. One thing I have noticed is that while ma has been in the hospital, there haven't been tons of calls from creditors all freaking day; there have only been two or three at the most. Anything else would have to have come through when I was visiting at the hospital.


Spooky, ain't it?

Monday, October 19, 2009

EPI, the facts.

Since it has been brought up recently, I've decided to air out the facts about the "Top secret community project" [sic] known as EPI. It is also my request -- and being that I am an Admin here, one that I will enforce -- that any comments about this go into a separate thread. Whoever starts one first has my blessing for collecting comments.


[color=blue]This post will be locked along with the thread; everyone shall respect this. A copy will also be retained elsewhere.[/color]



[i][u][b]
Project Status?
[/b][/u][/i]

Stalled (i.e. postponed until further notice), because of "Real life" taking priority for the developer.


4 people were involved in the project at its height, and provisions were made to approach 2 or 3 others at a later date. In answer to some people's questions: yes, Graedus and myself were involved; the "One guy" tasked with all the coding was none other than myself. I enjoyed it greatly while off-work hours permitted.



[i][u][b]
EPI, what the heck is that?
[/b][/u][/i]

EPI is short for "Encapsulated Package Installer". It is a method for integrating existing FreeBSD software management systems with an easier-to-use means of distributing third party packages (like PBI), and the ability to integrate with any desktop environment or user interface system.


[i][u][b]
EPI Project Goals?
[/b][/u][/i]


[list=1]
[*]Remove the need for the user to deal with the "Package, Port, or PBI" question.
[*]Make creating "EPI" files a snap, with a minimum of fuss for the maintainer, and make the build trivial to automate (but not like PBI's build system).
[*]Support console and desktop environment agnostic installation, without inconvenience to maintainers.
[*]Create something that could be owned, managed, and operated by the community as a whole; not by the PC-BSD developers, who proved to be incompetent and incapable with their mismanagement of PBI, at least in our (4 sets of) eyes.[/list]

It was intended that once the system was developed, it would be further refined by the community and someday replace the PBI system, becoming the [i]de facto[/i] standard way of managing software on PC-BSD. Likewise, it was also intended that once the EPI system matured, it would become the means by which PC-BSD itself would manage system software - instead of mucking with people's ports.


While EPI is not my concept of what kind of package management system PC-BSD needs, it is our concept of what PBI should have been in the first place, and what [i]PBIs[/i] could have become if properly managed by PC-BSD's developers.



[i][u][b]
How does it (EPI) work?
[/b][/u][/i]


First, a FreeBSD port is created for the given program; this has been done for most software worth running on FreeBSD, and should be done for anything else. There are approaching 21,000 programs in the ports tree, so much of that work is already done for us all.


Second, a maintainer writes out a description of the particulars. This basically amounts to stating:

[list]
[*]What port(s) does this EPI provide?
[*]What EPIs does this EPI depend on?
[*]Who are you?
[*]Special Needs Hook[/list]
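
To make this concrete, here is a sketch of what such a description might look like. This is purely illustrative: the field names and layout are my own invention for this post, not part of any finished EPI spec.

```
# Hypothetical EPI description file -- every field name here is invented
provides:   www/firefox35
depends:    epi-core-services, x-window-system, gtk-runtime
maintainer: Joe Example <maintainer@example.org>
# Special Needs Hook: optional EPIL script, rarely needed
epil:       none
```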


By stating what ports are to be provided, for example "www/firefox35", the build system would then automate the process from there on, without maintainer intervention. Firefox would be fetched, built, and stripped down to the minimal required dependencies. This would all be done inside a private jail on a build server, wherein there is nothing else to interfere with the process (that is, inside the jail).


The "Firefox EPI" would depend on several other EPIs; in this case it would depend on the following ones: EPI Core Services (the EPI system tools), X Window System, and GTK+ Runtime. Because of this, issues relating to X and GTK dependencies are removed from the Firefox EPI, creating a *much* smaller download and a more manageable interface for people who just want to install Firefox and have it just freaking work without trouble! Because of this design decision, unlike with PBI, dependencies are automated and can be checked; PBI does not support that. EPI's way of doing it results in ease of maintenance, more effective use of disk space, and more effective integration with FreeBSD. Another great perk is that it makes writing things like Flash or MPlayer plugin EPIs much less painful than with PBIs.


For security reasons the EPI file would be digitally signed during the creation process. Every maintainer has their own "Key" that is used for signing EPIs that they create. This allows a package to be traced back to its creator, who must manage their reputation within a "Web of Trust" distribution model.
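
As an illustration of the model (the real EPI signing tools were never finished, so this is only an analogy), a detached signature scheme can be sketched with stock OpenSSL:

```shell
# Sketch of the signing model using OpenSSL as a stand-in; nothing here
# is the actual EPI toolchain, it only demonstrates detached signatures.

# Maintainer side: create a key pair once, then sign each EPI at build time.
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 \
    -out maintainer.key 2>/dev/null
openssl pkey -in maintainer.key -pubout -out maintainer.pub

echo "pretend this is the package payload" > firefox.epi
openssl dgst -sha256 -sign maintainer.key -out firefox.epi.sig firefox.epi

# User side: verify against the maintainer's published, trusted public key;
# prints "Verified OK" when the file is exactly what the maintainer signed.
openssl dgst -sha256 -verify maintainer.pub -signature firefox.epi.sig firefox.epi
```

The "Web of Trust" part is then purely a policy question: which maintainers' public keys the user's installer chooses to accept.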


In case of "Special Needs", there is a special Encapsulated Package Installation Language, or "EPIL" for short. EPIL is a simple declarative way of scripting the installation process. It is analogous to the various 'PBI.*.sh' Bourne Shell scripts used in PBI. [b]Unlike PBI, the EPIL system is designed for use by non-programmers and is rarely required[/b]. Everything that can be done for you will be done for you, in order to create the best possible encapsulation of work, minimize your hardship, and make life easier on both users and maintainers. By contrast, creating a PBI requires an understanding of UNIX shell scripting and programming, and employs a flawed Application Programming Interface, which usually results in poorly created PBIs and fewer maintainers. EPI solves this problem by applying sound software engineering practices, and makes creating an EPI a snap. [b]Under normal conditions the maintainer never has to write an EPIL script, and even then it is less trouble than writing a forum post[/b]. The maintainer has no need to worry whether the installation is text, graphical, attended, or unattended; all standard needs are handled by magic; that is the massive opposite of traditional PBIs.



After creation, the EPI file makes its way to a sacred repository for evaluation; a way of downloading it is provided to the author. Trained people inspect the maintainer-serviceable parts, i.e. no hidden "delete all files on your system" kind of bug. Both those trained folk and regular but trusted people then test the individual EPI to make sure it works as advertised. A simple checklist is used to note that down correctly; reports shall be publicly posted and the maintainer notified.


If no show stoppers were found and the maintainer is in good standing with the community authority, their package is then hosted for public download in accordance with whatever community policy is deemed appropriate. The community website (think of the PBI Directory) would then host a download link and all necessary data, so that end users may download the created EPI.


If enough end users complain or have problems with EPIs created by a specific maintainer, that maintainer's rights to use the community systems will be temporarily revoked (permanently if need be), and the maintainer's "Key" will become untrusted by the EPI Community Authority - thus invalidating our trust in that maintainer's EPIs being safe for general consumption. Individual end users have the ability to ignore the community's decision and install those EPIs anyway.



An end user then downloads the EPI file; it could be from the Community Authority's website, from a friend, or even direct from the third party's website! (E.g. Adobe, KDE, Gnome, etc.)


The end user chooses the installation method: graphical or textual.


To install via graphical mode, simply double click the .epi file on your desktop and it will begin a graphical installation wizard. The wizard run is user serviceable, with a default based on your environment; i.e. Gnome & Xfce users get a GTK+ based wizard, KDE users get a Qt based wizard. Users and developers could create their own wizards.

To install via textual mode, simply run the installation program in the shell:

[code]
sample# epi-install ./Mozilla_Firefox-3.5.3-i386.epi
[/code]


Both methods invoke the same program, but with different arguments.



The epi-install program updates its understanding of "Trusted Keys" published by the Community Authority, or any other source the user chooses to trust; the user can even skip this step.


Assuming all has gone well, epi-install then unpacks Firefox accordingly, verifying the maintainer's signature and the package's integrity. If found, the compiled EPIL script is run - the user can choose not to run the script. Normally this is a moot point, because there shouldn't be any script needed. Of course, the EPI's installation is recorded in a database.


What the user sees depends on how it was run. In text mode they get a console friendly way of doing the installation. In graphical mode, they get a GUI install wizard like PBI. Environment variables and command line switches are provided to override behaviour - for example, choosing to run the Qt wizard under Gnome. All this is so easy because the EPI maintainer was never arsed with dealing with it; it was done automatically for them.
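
A couple of hypothetical invocations show the idea (the switch and variable names here are illustrative; the interface was never finalized):

```
# force the Qt wizard, even under Gnome (names are illustrative)
EPI_WIZARD=qt epi-install ./Mozilla_Firefox-3.5.3-i386.epi

# force a plain console installation
epi-install --text ./Mozilla_Firefox-3.5.3-i386.epi
```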



Firefox is now installed, and the end user can run it from its location in $EPI_ROOT. The default $EPI_ROOT would likely be /usr/epi if adopted by PC-BSD. When "Installed as a third party product" on FreeBSD or PC-BSD, the default $EPI_ROOT would likely be /usr/local/epi.


Our way of doing things would give both shell users and desktop users a fairly painless way of accessing Firefox, without favoritism to KDE or Gnome.


[i][u][b]
Ok, so how does this relate to PBI?
[/b][/u][/i]


PBIs are managed by the PC-BSD developers, and the people trusted with watching over the safety of end-users are either corrupt, derelict, or incompetent. [i]EPI[/i], would instead be placed into community hands, so that no one person or entity has total control.


As a format, how they work is very different. An EPI is a compressed archive containing program files and meta data; an external program located on the user's machine handles the installation procedure. This is how many package management systems designed for UNIX work; Microsoft's own Windows Installer is not too far off either. APT, DPKG, RPM, and FreeBSD packages work this way as well. The PBI format, on the other hand, is a self extracting executable with an embedded archive containing additional meta data, program files, and an embedded installation wizard. The PBI system is dependent upon the FreeBSD version, KDE version, and the presence of system programs -- PBI is written in C++ but done like a shell script. Internally, PBI is both ugly and non-unix like. [i]EPI[/i] instead provides a more platform and version independent way of doing things.


The format of PBI files, how they work, what they do, and how they are managed by the system is generally undocumented. [i]EPI[/i] would provide a totally documented system, making it easy for developers, system administrators, end users, and businesses. Heck, you could even create your own EPI system that is totally compatible - you can't do that with PBI, unless you read a lot of bad code and cuddle up to a nasty API.


In order to make a PBI, you need to know quite a bit about shell scripting and do a lot of leg work that should be done for you automatically; the end result is that most PBI install scripts are bad, even Kris Moore's are shoddy. [url=http://sas-spidey01.livejournal.com/389068.html]I found an old PBI script that I wrote a while back[/url] that is done 'properly'.


Because [i]EPI[/i] files are thoroughly checked at install, the system tries to ensure that what you download is exactly what the maintainer created. By contrast, the PBI file you download is not guaranteed to be what the maintainer created; there is no safeguard against tampering with the files on the mirror - the user is on their own, without so much as a checksum of the actual .pbi file that was sent to the mirror!



[i]EPI[/i] has a simple and documented dependency model. If you don't have the GTK+ Runtime EPI installed, the Firefox EPI should warn you. The EPI Core Services provides, as a contract, a set of ultra-common dependencies used by many programs. This reduces wasted disk space and allows EPI to work across differing versions more easily. Our decisions would have created a more robust system than PBI, while minimizing dependency hell to the same level. The way PBI does things creates more work for the maintainers and causes more interoperability/integration problems that the PC-BSD developers simply don't care to address. PBIs with hidden or undocumented dependencies are also not uncommon, because the only 'standard' of what a PBI can depend on is the release notes for PC-BSD X.Y.Z and the PBI "Guidelines" as they have become - which used to be rules, that were just often thrown in the trash can.
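
Mechanically, the warning could be as simple as checking declared dependencies against a database of installed EPIs. A minimal sketch, with the database layout and the EPI names invented for illustration:

```shell
#!/bin/sh
# Hypothetical sketch of an EPI dependency check; the database layout and
# the EPI names are invented for illustration, not part of any real tool.

EPI_DB="./epi-db"    # pretend database of installed EPIs
mkdir -p "$EPI_DB"
touch "$EPI_DB/epi-core-services" "$EPI_DB/x-window-system"
# note: gtk-runtime is deliberately NOT installed

# return non-zero if any named EPI is missing from the database
check_deps() {
    missing=0
    for dep in "$@"; do
        if [ ! -e "$EPI_DB/$dep" ]; then
            echo "warning: required EPI '$dep' is not installed" >&2
            missing=1
        fi
    done
    return $missing
}

# The Firefox EPI would declare these three dependencies:
check_deps epi-core-services x-window-system gtk-runtime \
    || echo "aborting: missing dependencies"
```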



[i][u][b]
OK, OK, enough already, but who the heck are you Terry?
[/b][/u][/i]


I am a person who has used computers almost since he learned how to walk. Someone who loves software engineering, and shares his grandfather's work ethic: if it has your name on it, then it has to be GOOD.


Several years ago, I encountered UNIX and embraced it with an open heart.


The first date with PC-BSD was 1.0RC1, a release candidate shipped in 2005. I have since run various versions of PC-BSD on home servers, desktops, and laptops. During the 7.0Alpha testing cycle, before KDE4 had even entered the picture, I made the decision to transition my personal workstation to FreeBSD, and never looked back.

I joined this forum to see if I could help people and learn a thing or two from my superiors. Even after they left, I remained, eventually becoming a Moderator and later a Forum Administrator at the request of Kris Moore - likely because I shouted loudest of all about the spam problems. My activity on the forums over the past ~two years has rubber banded with my work schedule and the rest of living.


For a time, I created PBIs of programs that interested me, such as Blackbox, which was the first alternative to KDE to be had in PBI form. After a while, flaws in the PBI design and the developers' disregard for their own rules caused me to "give up" on creating and maintaining PBIs. I have seen everything there is to be seen with PBI, down even to the point of Charles breaking the PBI rules, PBI developers publishing their own PBIs without testing, Kris Moore changing the rules after breaking them for Win4BSD, and Kris changing them back over community outcry (and increasingly lousy PBIs lol). Throughout it all, the process of getting PBIs published has made me sicker than watching a corrupt government at work.


I've generally kept a safe distance from PC-BSD development, which is why I never became involved with the development team; their actions over the years also give me no desire to "volunteer" my services.



When the question of whether PBIs could be created automatically was brought up many moons ago, it was shot down, most strongly by none other than the (then) host of the PBI Directory website (Charles). I supported the idea, and it was generally held in contempt as something "impossible", only later to become a part of PC-BSD. Exploring it ahead of everyone else was actually how I learned much about shell scripting.



My skill set includes C, C++, Java, Perl, PHP, Python, and all the way to parts of x86 assembly, Scheme, and Common Lisp. More task specific issues such as SQL, HTML, LaTeX, sh, bash, batch/cmd, AWK, SED, and indeed, even [i]ed scripting[/i], are a part of my abilities.


I think no one will debate that I know a thing or two about that which I speak.

A look at a *good* PBI install script from 2007.

In looking around my ~/Projects folder, I found an old PBI install script I wrote a couple of years back. When I was working on a TeX Live PBI (only me and Oko were interested), I wrote a very robust script. Most PBIs that use install scripts should be nearly like this, but normally they only handle the GUI case and screw you if you use text mode or have special needs.


This script was written to the published API at the time, which basically amounted to $INSTALLMODE being set to tell you if you were in a GUI or not, and a variable to tell you what name your PBI would have inside /Programs. This was an improvement over using "$1" everywhere, as was required in previous versions of the PBI API.

Here is my old script:

#!/bin/sh

# the year of this release, e.g. 2007, 2008, etc. -> This should be equal to
# the Program Version we set in PBC.
YEAR="2007"

# the size of all our installed files in $PBI_BASE/texlive/$YEAR/
TEXSIZE="1.1GB" # as a string

# compat check.. should be unnecessary but I don't trust PC-BSD. Since users can
# now override our install location and the documentation does not tell us about
# coping with it not being /Programs/$PROGDIR/, abort the install rather than
# risk performing undefined/undocumented behavior should /Programs/$PROGDIR be
# overridden.

if [ -d "/usr/Programs" ]; then
    PBI_BASE="/usr/Programs"
elif [ -d "/Programs" ]; then
    PBI_BASE="/Programs"
elif [ -d "/usr/local/MyPrograms" ]; then
    PBI_BASE="/usr/local/MyPrograms"
else
    if [ "$INSTALLMODE" = "GUI" ]; then
        kdialog --sorry "Can't find PBI Installation Directory... aborting"
    else
        echo "Can't find PBI Installation Directory... aborting"
    fi
    exit 255
fi

# set the path
MY_TEX_PATH="$PBI_BASE/texlive/$YEAR/bin/i386-freebsd/"


# XXX check if we are already installed or improperly installed

if [ -d "$PBI_BASE/texlive/$YEAR" ]; then
    if [ "$INSTALLMODE" = "GUI" ]; then
        kdialog --sorry "$PROGDIR appears to be already installed, aborting"
    else
        echo "$PROGDIR appears to be already installed, aborting"
    fi
    exit 255
fi

# give the user a chance to abort the installation
if [ "$INSTALLMODE" = "GUI" ]; then
    kdialog --warningyesno \
    "The installation requires approximately $TEXSIZE of disk space. Continue?"
    if [ $? -gt 0 ]; then
        exit 1
    fi
else
    echo "The installation requires approximately $TEXSIZE of disk space."
    echo -n "Continue? [yes/no] "; read GO
    echo "$GO" | grep -qi "no"
    if [ $? = 0 ]; then
        exit 1
    fi
fi


# Do installation
echo 'MSG: Setting up TeX Live directory'
mkdir -p "$PBI_BASE/texlive/$YEAR"
ln -s "$PBI_BASE/texlive/$YEAR" "$PBI_BASE/$PROGDIR/TEXLIVE_$YEAR"

echo "MSG: Installing TeX Live files..."

# extract our texlive installation
cd "$PBI_BASE/texlive/$YEAR/" && \
    lzma d "texlive${YEAR}.tar.lzma" -so | tar -xpf -

# prompt for a default paper size
if [ "$INSTALLMODE" = "GUI" ]; then
    PAPER=`kdialog --combobox "Select default paper size" \
            "A4 Paper" "US Letter"`
else
    echo -n "default paper size [A4 Paper, US Letter]: "; read PAPER
fi

echo "$PAPER" | grep -qi "Letter"
if [ $? -eq 0 ]; then
    texconfig-sys paper letter
fi

echo "$PAPER" | grep -qi "A4"
if [ $? -eq 0 ]; then
    texconfig-sys paper a4
fi


echo "MSG: Updating TeX filename databases"
texconfig-sys rehash


if [ "$INSTALLMODE" = "GUI" ]; then
    kdialog --yesno \
    "Congratulations! Installation complete -- test installation?"
    if [ $? = 0 ];then
        cd /tmp/
        $MY_TEX_PATH/pdflatex sample2e.tex && kpdf /tmp/sample2e.pdf
        if [ $? -gt 0 ]; then
            kdialog --error \
        "The test may have failed to load, please report any errors: $?"
        fi
    fi
    echo "MSG: Displaying readme file"
    kwrite $PBI_BASE/$PROGDIR/PBI_README.txt 
else
    # I don't know a program off hand to use for console output.
    more $PBI_BASE/$PROGDIR/PBI_README.txt
fi


What is wrong with that script? Nothing compared to what is wrong with the PBI script API.


$PROGDIR should expand to /Programs/ProgNameVer instead of ProgNameVer.

The maintainer shouldn't have to treat GUI and TEXT mode differently - the API should do it. In fact, that was the main impetus for the EPI spec's platform independence. In an EPI, people would just say "ask the user a question", not have to understand shell script and massive toggles for a GUI/TEXT mode install.


I believe PBIs now also do similar idiocy in regards to font handling that should be done at a higher level. However, according to the officially published documentation, my old script is still the correct way to go with a PBI.


One of the reasons I gave up on Kris Moore's PBI was the brain damaged design; the other reasons were the total mismanagement of PBI...



Under the draft specs I wrote for EPI, the above script would have become something like this:

# optional EPIL script for EPI

define question dialog with
        title = "TexLive EPI"
        message = "Please set a default paper size"
        default = "A4 Paper"
        options = ['A4 Paper', 'US Letter']
launch as paper_size

if paper_size == "A4 Paper" then do
        execute "texconfig-sys paper a4"
else
        execute "texconfig-sys paper letter"



Now I ask you: if you were not a programmer, which one would you understand? Then let me ask you: if you just want to get on with your life, which would you rather write?


Both the PBI sh and EPI EPIL scripts would result in a simple dialog asking you to choose from one of two options. If you know how to use TexLive, the EPIL script would be totally unnecessary because you already know how to change paper sizes. That means you don't even have to write a script at all in a normal case.



End result? Oh so much simpler.

Thursday, October 15, 2009

Marley & Me

Tonight I was watching a movie called Marley & Me. The film's an excellent glimpse into what life can be like, for anyone who hasn't been living with their head so far up their arse to notice 8=) It's about a couple of newlyweds who trade the winter for a warmer location in Ft. Lauderdale, south Florida. In order to stave off his wife's plans for a family for a few years, journalist John Grogan springs an early birthday present on his wife Jenny: they adopt a yellow Labrador Retriever, the clearance puppy. While successful in his intentions, John's plan backfires when young Marley proves to be one of the world's most hyper-destructive dogs, yet too much of a lovable lug to just get rid of anytime soon. The movie charts a course that I would call "a slice of paradise", with all of its pitfalls to go along with it. Marley & Me follows the lives of the Grogan family, and dear but incorrigible Marley. If you love dogs, you'll love Marley; if the exact opposite is true, well, you'll be relieved not to be in such deep doggy waters >_>. I love animals, always have and likely always will; having dogs, I can also be sympathetic to the whole ruckus caused by Marley. Hmm, for some reason I can't help but remember a dog named Milo, that I used to help look after years ago as part of this business; he too failed obedience school (horribly lol).

For me, I think it is fair to say that I feel a bit of a personal connection with this film. Fort Lauderdale is the city that I grew up in as a child, so Broward County is a name I'd know anywhere, and the newspaper in South Florida is also one that my parents used to deliver for... it's a small world, isn't it? Even closer to home than that, their first son's name, Patrick, was also the name of one of my elder brothers: most people that know me fairly well also know that I have an older brother, but there's more to my family's history than that. In short, my brothers Reese and Patrick were twins, but only Reese survived. My father also had a son, long before I was born, but Jeffery never quite made it into this world :'(. Whatever the afterlife holds, one thing that I have always hoped is that someday I'll see us all together in heaven.


As everything must someday, life on earth eventually comes to an end, and Marley is no exception to that rule. Years roll on, and take their toll: Marley grows old, as we all will some day. The ending is very sad, but I would have to say that he had a very good life, and it was one full of much chewing too ;). I think that perhaps this line from the film sums up best what it's like to have someone like that in your life:


John Grogan:
A dog has no use for fancy cars, big homes, or designer clothes. A waterlogged stick will do just fine. A dog doesn't care if you're rich or poor, clever or dull, smart or dumb. Give him your heart and he'll give you his. How many people can you say that about? How many people can make you feel rare and pure and special? How many people can make you feel extraordinary?


If anyone can watch Marley & Me and take the ending without drawing a tear along the way, I truly feel sorry for any animals in that person's care.



We all have people in our lives, animals as well, that are such an integral part of our life, of everything that makes it worth living. You can't hope for more than that.

Monday, October 12, 2009

of passwords and tags

Having some spare time, I set to updating my code books while I wait on the Google Wave video to reload; I had to pop off at ~55min for a Live Operation ;). Ahh, crud! Now it looks like I'm going to be interrupted by dinner 8=).

Around late April, I devised a new schema for how I handle my password management—yes I am insane. I actually do have an encoding system for increasing password strength, that is tied to how my brain functions. It's built from a system of standard passwords that are pseudo-salted with mnemonics that are tied to both the schema and how my brain internally organizes data. The end result, a systematic password that is more difficult to brute force, and hard to guess unless you are me ;).

I've just converted about 50% of my core services to the new password system, and I'll do the rest tonight.


At the moment, my bookmarking has been focused around Delicious. The pattern I have chosen for tagging is roughly the same as what I've been using in Gnolia: Category/Sub-Category/... and in place of larger meta tags (e.g. Programming/Perl) I've bundled the tags. It's an informal hierarchy, but rather nifty. Gnolia lacks bundles, which I dislike about it; but honestly, the only improvement Delicious's method offers is that you can't forget to add the meta tag alongside the regular one (e.g. Programming/Perl/Frameworks).

The tags I've arranged are strongly organised; I like keeping things tidy. There is a simple "Personal" tag for things connected with me, and Services/ for various services: Services/Webmail, for example, contains marks for Google and Live mail. Programming/Language/{Distributions,Documentation,Modules,Frameworks, ...} is also in use.


For some reason, I love organizing crap.... lol.

Sunday, October 11, 2009

Something about this song just grows on me

Mister Officer, I didn’t mean to speed
No I didn’t know I was pushing 90
Mister Officer it’s such a pretty day
Can’t you let me go with just a warning

Mister Officer, I’m so sorry
Yes I know that this is not the autobahn
Mister Officer this isn’t like me
Yes sir, yes I have a reason

I had the top down the radio blaring through the speakers
I think I blew a speaker out
I wasn’t trying to be reckless
I wasn’t even in a hurry
I just can’t stop thinking bout him
Can’t stop thinking about him
Shouldn’t be thinking while I’m driving but I can’t stop

Mister Officer I didn’t see you
And I guess I didn’t hear your siren
Mister Officer I know you’re serious
I can’t help it if I can’t stop smiling

I had the top down the radio blaring through the speakers
I think I blew a speaker out
I wasn’t trying to be reckless
I wasn’t even in a hurry
I just can’t stop thinking bout him
Can’t stop thinking about him
Shouldn’t be thinking while I’m driving but I can’t stop

I had the top down the radio blaring through the speakers
I think I blew a speaker out
I wasn’t trying to be reckless
I wasn’t even in a hurry
I just can’t stop thinking bout him
Can’t stop thinking about him
Shouldn’t be thinking while I’m driving but I can’t stop
Can’t stop thinking bout him
Can’t stop thinking about him
Shouldn’t be thinking while I’m driving but I can’t stop

Mister Officer, I didn’t mean to speed
No I didn’t know I was pushing 90
Mister Officer it’s such a pretty day
Can’t you let me go with just a warning

-- Mister Officer, Jypsi



Friday, October 9, 2009

Bugs can be fun, as long as I didn't write them

In order to make optimal use of tonight, I did a portsnap and fed a list of ports to be updated into my updater.sh script, then went to work on playing with pthreads. A short while later, when things got to ImageMagick, I got the shock of my week: pkg_delete crashed during a make deinstall!

In looking through the code, I've found the reason why: there's a package name passing through the code as a null by the time it finishes passing through pkg_do in src/usr.sbin/pkg_install/delete/perform.c. From the looks of things, it goes bonkers once Plist is set up via read_plist(). Hmm, well well, the Package (_pack) structure being passed to it has rather interesting contents at the time.

I just don't have much more time to fiddle with this damn thing! I've got to be up for another groaning day of work tomorrow.


OK, found it, there's some funkiness here. When it hands off to add_plist() (basically every damn thing in the bloody +CONTENTS), it has NULL'd the doohickey that gets copied in later. read_plist() sucks in a file line by line, and it looks like if the trailing character is a space, read_plist() sets it to the null character (\0).

That creates a bit of a problem, because the +CONTENTS file for ImageMagick has a line '@pkgdep ', which results in pissing off the whole damn thing... lol.
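To illustrate (in Python rather than the actual C, with helper names I made up for the sketch), here's roughly what that trim-then-parse behavior does to a '@pkgdep ' line:

```python
def parse_plist_line(line):
    """Mimic the trim-then-split behavior described above.
    This is an illustration, not the real pkg_install code."""
    line = line.rstrip("\n")
    # read_plist() nulls out a trailing space, chopping the argument off
    if line.endswith(" "):
        line = line[:-1]
    if line.startswith("@"):
        cmd, _, arg = line.partition(" ")
        return cmd, arg or None   # a bare "@pkgdep " yields no argument
    return None, line

print(parse_plist_line("@pkgdep ImageMagick-6.5\n"))  # ('@pkgdep', 'ImageMagick-6.5')
print(parse_plist_line("@pkgdep \n"))                 # ('@pkgdep', None)
```

That None (a NULL, on the C side) is exactly what gets dereferenced later.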



So... how to handle this problemo? I see two things: 0.) pkg_delete should NEVER FUCKING CRASH!!!! No matter what is in a +CONTENTS file, at least, that is my opinion!!! And 1.) if '@pkgdep ' is not valid in a +CONTENTS file, whatever causes ImageMagick/ports to shove it there needs to be found and fixed. Digging into +CONTENTS file creation is a beast for another hour. Why the pkg_delete program chooses to pass a NULL through I have no bloody idea; maybe sifting through CVS logs might hold the answer to that mystery. The pkg_install suite has some rather ugly and quickly hacked together parts, which really makes me wish they had used shell or (like OpenBSD) imported Perl into the base for the job, rather than doing it in C. Don't get me wrong, I like C, but please don't write functions with over 1,000 Lines Of Code ;). Either way, when it comes to fixing the pkg_install issues... that's something I'm not going to touch unless a developer suggests what they would like to see in a patch, because whoever is maintaining it should have a better overview of things than I do at the moment; I'm in no shape to do any more thinking tonight. Perhaps I'll just file a bug report on it and see what comes of it.
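For what it's worth, the kind of belt-and-suspenders behavior I mean by point 0 sketches out like this (my own Python pseudocode, not the pkg_install sources): tolerate the junk entry, warn, and carry on.

```python
def dependencies(plist_entries):
    """Collect @pkgdep arguments from (command, argument) pairs,
    tolerating malformed entries instead of crashing on them.
    Hypothetical sketch only."""
    deps = []
    for cmd, arg in plist_entries:
        if cmd != "@pkgdep":
            continue
        if not arg:
            # the '@pkgdep ' case: complain, don't dereference a NULL
            print("pkg_delete: warning: ignoring empty @pkgdep entry")
            continue
        deps.append(arg)
    return deps

print(dependencies([("@pkgdep", "ghostscript8"), ("@pkgdep", None)]))
# ['ghostscript8'] (plus one warning line)
```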


Right now I just need to get some freaking sleep before work. Ugh, stairs here we come.......

Thursday, October 8, 2009

Interesting tidbit: Twitter Talking Separately to Microsoft and Google About Big Data-Mining Deals

Shared from Google Reader

Twitter Talking Separately to Microsoft and Google About Big Data-Mining Deals—BoomTown


If this is how they plan to make a crappy search engine (Microsoft's) better, by trusting in Twit'wits, oh boy am I glad that Mage introduced me to Google all those years ago lol.


I've used several search engines over the years; in the end I have no respect for Microsoft's offering (Hotmail has improved, search has not) and some respect for the one over at Ask.com, which I was introduced to when some program (thankfully) changed my default search provider in IE. Since I've never used Yahoo! for searching, I generally skip commenting on them. Microsoft, however, I used their search engine for many, many years, dating all the way back to WebTV—at least Google tries to find what I'm looking for ;).

Notes over lunch

Work yesterday was fairly uneventful, so it left me time to concentrate on programming, doubt Friday will be so lucky... lol.

I worked out the basic architecture for pipin's daemon, and have had the details on my mind for most of the time since ~1400Q yesterday. Currently on the hit list is a pair of prototypes: one built around libpurple, to deal with the bare bones task of getting connected; the other an experiment in moving the "Gatekeeper" component into its own thread. Once both prototypes are done, I'll look at merging them and re-evaluate how they play together versus mucking around with GLib's main loop. Conceptually, pipin-imd consists of three units: purple, dispatcher, and gatekeeper. The purple unit deals with interfacing libpurple into our own kit; the dispatcher with mapping between purple/pipin-im events and notifying all registered clients; and the gatekeeper with managing incoming data from pipin-im clients. I also have an idea of how the communications protocol between daemon and client might work, not to mention the fact that I want a simple net command shell that would allow communicating with the daemon via a shell or batch script lol.
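As a rough sketch of the dispatcher unit's fan-out idea (all names here are placeholders of mine, not actual pipin-im code):

```python
class Dispatcher:
    """Toy model of the dispatcher unit: clients register a callback,
    and every purple-side event is fanned out to all of them."""

    def __init__(self):
        self._clients = []

    def register(self, callback):
        self._clients.append(callback)

    def dispatch(self, event, payload):
        # map a purple/pipin-im event onto every registered client
        for notify in self._clients:
            notify(event, payload)

seen = []
d = Dispatcher()
d.register(lambda ev, data: seen.append((ev, data)))
d.register(lambda ev, data: seen.append(("copy:" + ev, data)))
d.dispatch("buddy-signed-on", "alice")
print(seen)  # [('buddy-signed-on', 'alice'), ('copy:buddy-signed-on', 'alice')]
```

The real thing would of course be C over GLib, but the shape is the same.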


The plan is for the daemon to be written in C and licensed under the GPL; since purple forces use of GLib, there is no reason to use C++ for the sake of the Standard Template Library. Whatever the legality of using Python ctypes based or SWIG generated code to interface with libpurple is, I doubt it would be in the spirit of the damn blasted GPL, even if the license was less restrictive >_>. The client unit I plan to write in Python, using whichever widget toolkit proves most appropriate (Qt, GTK, Wx). The daemon is a pretty simple program; the client side stuff gets all the fun, and a license more in line with my ideals of freedom than the uglicious GPL.



Originally I had planned to work on the threaded gatekeeper prototype during the time before dinner and afterwards, but never got around to it. It wasn't a good night's sleep, but at least I went to bed early for a change.... lol

Wednesday, October 7, 2009

Tuesday, October 6, 2009

Present open-loops

  • General SAS related work
  • Making my game prototype load key=command mappings out of an rc file
  • Finish Cara's skins
  • Get to work on pipin-ims backend daemon
  • *Several* personal but business related web-projects
  • Continue compiling my letter about PC-BSD shortcomings
  • Setup/test CppUnit, Boost, and POCO
  • Customize Irssi and evaluate its possibilities for replacing WeeChat
  • Figure out how much crap didn't make it onto my todo list before it fell off
  • Work on the SAS Tactical Command Interface
  • Work on several other small SAS-related projects

and all I can think about right now is a quick snack.... lol

Sweet, Google Reader can send things to LiveJournal

Like many people, I often have things that I want to follow, but can't be arsed to check up on periodically; the solution of course is RSS or "Really Simple Syndication" feeds. The age old problem is the bother to actually _look_ at the RSS feeds in question lol. A while ago, I switched to Google Reader during a period of reorganisation; a topic that I should probably revisit in a few weeks.

While much of my life is an open book, most of the services I use are not very integrated even when they are capable of it; this is mostly by my intention! Most people on planet earth and beyond can reach me via instant messaging—the preferred way, since I'll hang ya if the phone rings >_>. My LiveJournal is my 'personal' place, and perhaps consequently one of the most public. There are other mainstream services that I've come to employ, which kind of creates a bit of an onion approach to my data lol. LiveJournal serves me; I don't actually care whether anyone reads it or not. After all, it replaced mounds of log files and such, and that is what its principal purpose was and still is ^_^.

One of my friends makes use of Google Reader, so I've started exploring Reader's ability to share and comment on feeds with others; which led me to this little puppy: “Send To” LiveJournal and Iterasi for Google Reader. Combined with a few other tidbits, this might get more frequent use: most things of interest to me in regard to RSS feeds end up noted in this journal anyway, well, if time permits lol.

The entry I've setup in Google Reader, thanks to the help of that link, results in exactly the kind of thing that I want: a suitable subject (that makes searching my lj easy) and a message starter that I can live with.

gnolia tagging ideas

I've been thinking about how to organize tags on gnolia.com; mostly I want something more hierarchical in nature than just top-level tagging. It doesn't really need to be built into gnolia, just something convenient.

There are two schema that I am thinking about, utilizing a file path style, e.g. programming/c++/qt, or the namespace syntax my wiki software uses at home, e.g. programming:c++:qt. Then again, there's always programming.c++.qt and all sorts of other combinations. Really, I think I like the Java/Python style dotted namespacing or simple folder paths. The issue of dealing with separators that may be utilized within a tag name is always a factor to consider, but shouldn't be a problemo for me. Why do I feel like reading a book on database normalisation? One nice perk: I could always design a little bookmark managing gizmo as an external application or an AJAX page. Gnolia seems to have moved developer documentation into the wiki, so it shouldn't be too hard. I could cook up something client side fairly trivially, but doing it as a web app might be more 'fun' in the short term, mmm!!! Short of writing my own program, there's really no way for the UI to bundle tags within tags, that I'm aware of (e.g. tag "Google" points to tags "Google/Services", "Google/Software", bookmarks for account "Preferences", blah blah). A simple tag cloud sorted A-Z, however, makes it fairly easy to see that kind of relationship.
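Whichever separator wins, expanding a hierarchical tag into its ancestors is trivial; a quick sketch (my own toy code, nothing gnolia provides):

```python
def expand_tag(tag, sep="/"):
    """Expand a path-style tag into itself plus all of its ancestor tags."""
    parts = tag.split(sep)
    return [sep.join(parts[: i + 1]) for i in range(len(parts))]

print(expand_tag("programming/c++/qt"))
# ['programming', 'programming/c++', 'programming/c++/qt']
print(expand_tag("programming.c++.qt", sep="."))
# ['programming', 'programming.c++', 'programming.c++.qt']
```

Posting all three to the service keeps the hierarchy browsable even though the site only understands flat tags.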

I may also play both decks at once, and try managing my data with Delicious at the same time, with a review to be made after a month or two. And of course, I can always whip up my own bookmark gizmo and place the server on Vectra for safe keeping.

Manually refreshing Windows desktop wall paper

Every now and then, the windows box ends up with "No" wall paper; typically due to issues with certain poorly created Unreal Engine 2 / DirectX games >_>. Earlier today, someone asked me a rather stupid question that brought me to thinking, is rundll32 even documented?

In poking my nose around, I found out that (as expected) Windows XP stores the name of the wall paper as a path to the BMP file in the registry; more specifically a REG_SZ at HKCU\Control Panel\Desktop\Wallpaper. In theory I could write a program to manipulate that value, then hook it up to my wall paper changer in place of utilities like hsetroot, mauhuahaua! The problemo is that Winsucks only seems to read that value on log in/out. The solution? Google. Third party programs can change the wall paper in real time, so obviously there has to be a way of doing it (hey, Windows does...), and as far as I know, most such routines would be tucked away in the shell32 and user32 libraries somewhere; enter rundll32.


rundll32 user32.dll,UpdatePerUserSystemParameters


problem solved ;)
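For the curious, the same refresh can be triggered programmatically; a sketch with Python's ctypes calling the Win32 SystemParametersInfo() API (Windows only; the wallpaper path is a made-up placeholder, and the function simply no-ops on other platforms):

```python
import ctypes
import sys

SPI_SETDESKWALLPAPER = 0x0014   # SystemParametersInfo action: set wallpaper
SPIF_UPDATEINIFILE = 0x01       # persist the new value (the registry key above)
SPIF_SENDCHANGE = 0x02          # broadcast WM_SETTINGCHANGE to running apps

def set_wallpaper(path):
    """Ask the shell to apply a new wallpaper immediately, no log-out
    required. Returns False when not running on Windows."""
    if sys.platform != "win32":
        return False
    ok = ctypes.windll.user32.SystemParametersInfoW(
        SPI_SETDESKWALLPAPER, 0, path,
        SPIF_UPDATEINIFILE | SPIF_SENDCHANGE)
    return bool(ok)

set_wallpaper(r"C:\wallpapers\current.bmp")
```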

Monday, October 5, 2009

A return to [ma.]gnolia.com

Today I introduced a friend to the tagging concept behind services like del.icio.us and ma.gnolia, and used the Wikipedia articles[1][2] to give her a quick tour of things. In thinking about it for a while, I've realised that I ought to look into this again myself.

For a while I was thinking about building up my bookmarks again, more formally than my cobweb riddled brain pan, that is. Today I figured that would mean using del.icio.us, if I was ever to go with a replacement for my old ma.gnolia account. When ma.gnolia went down, it basically took all of my bookmarks with it (unrecoverably). After that, I basically stopped using bookmarks lol; in fact the only one I'm actively using is a quick link to my usual radio stream, which I load every time I log in. Everything else is something I know the URL of, or can Google for the exact address (I love a good search engine more than an old phone book lol). The only thing that I actually miss is some of the programming articles I had tagged, but bookmarks are a fairly painless thing to lose; well, for a geek like me anyway, there's much worse.


In looking at ma.gnolia's website, it seems that the Wikipedia article is outdated. After a renaming to gnolia, it would appear that the service is alive again. While I expected to end up on Delicious (sad to see the domain hack go :'), the revived gnolia is a bit of good news IMHO. I don't know if I'll ever establish as large a bookmark collection as I used to have on ma.gnolia, but this time, as an extra safeguard, I intend to make regular exports every couple of months. Perhaps I'll even check the file into a git repo, dunno.


The really big question, is how to organize my tags.... hehe. Ok, sure my desk, room, $HOME, and everything else looks like a bomb hit it! But it's tightly organised, so I can actually find crap when I go looking for it ^_^.

The glory of Raven Shield / Unreal Engine 2....

OS: Windows XP 5.1 (Build: 2600)
CPU: GenuineIntel Unknown processor @ 3003 MHz with 2045MB RAM
Video: NVIDIA GeForce 8400 GS  (8250)

Assertion failed: Actor->ColLocation == Actor->ColLocation [File:.\UnOctree.cpp] [Line: 1703]

History: FCollisionOctree::RemoveActor <- ULevel::MoveActor <- NormalSubUzi37  <- UObject::ProcessEvent <- (R6TMilitant04 Alpines.R6TMilitant31, Function R6Engine.R6Pawn.SpawnRagDoll) <- AR6Pawn::UpdateMovementAnimation <- AActor::Tick <- TickAllActors <- ULevel::Tick <- (NetMode=3) <- TickLevel <- UGameEngine::Tick <- UpdateWorld <- MainLoop
Both Raven Shield and SWAT 4 display crash messages like these, so perhaps it is an Unreal Engine 2 thing rather than specific to RvS/S4, but if it is, I would assume there is a way to turn it off. My feelings: this is good stuff to see if you are one of the game's developers or testers—but it should _never_ be seen by retail customers! Not only is it Martian to regular people; since we can't go edit and recompile the code ourselves, all it does is display information that we didn't need to know. If I was going to do something like that for crash handling in a *release* product, I would probably make it say "Programmer fuck up, please sue the company for idiocy" :-)

This reminds me of one time I was on the website of a large North American company, when for doing nothing at all but routine, their website gave me the most interesting error messages.... telling me enough to find out several server side paths, their otherwise hidden implementation language, and enough data to clue in on what "stuffs" were being used to make the whole show go. I nearly died laughing lol. Maybe I'm a freak, but I don't think users should be allowed to see developer information in a closed product like that.
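Side note on that assertion: comparing a value against itself looks silly, but it's a classic NaN check. Under IEEE-754 floating point, NaN is the only value that is not equal to itself, so the engine is presumably asserting that the actor's collision location hasn't gone NaN (my reading of the log, not anything Ubisoft documents). The same trick in Python:

```python
nan = float("nan")

# NaN is the only floating point value for which x == x is false,
# so (x == x) doubles as a cheap "is this a real number?" check.
print(nan == nan)   # False
print(1.5 == 1.5)   # True
```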
post script: this was my 1500th journal entry

Sunday, October 4, 2009

Saturday, October 3, 2009

Despite a more miserable than not day, I've actually made some progress with my game project. Fetched Ogre 3D's trunk via Subversion, built it with CMake/MSVC, set up a suitable SDK spot, then got my project building against it with CMake and executing.

The main adjustments needed atm are building up the configuration file handling and implementing the principal movement commands that remain (e.g. creep, walk, run, sprint). There really is no game engine in the traditional sense, because I don't want any of the ones I've looked at! My intention is to refactor the prototype into a suitable baseline for use with other games I would like to build in my free time; I hate to repeat myself :-P.

It's still very early for me, but I think I'll turn in for the night, after I have the servers prepped for tomorrows Live Op. I'm interested to see how it will go, and very much wondering who will end up as the Element Leader, hehehe.

Quick note to self

Setup Boost, Doxygen, and CppUnit on SAL1600.
Experiment with POCO when time permits: http://pocoproject.org/

Rolling an idea into an operation!

Well, my live operation is all but completely setup for Sunday; all I have to do is zip up & attach the intel photos, get the map files ready to rock, etc. I think I've taken about eight to ten hours in prep for it :-/. It takes me all of like, five minutes to come up with an idea for a live operation, then I spend time cooking up more details and brain storming.... once them creative juices get flowing, watch out! You just never know where we'll end up before zero hour lol. As stressful as it can get when Murphy leaps out of no where, I really do love LOs in SAS.


I posted a short message in the LO forum on SAS this afternoon, and spent parts of the afternoon / morning working on the details, including an OPORD. This one is going to have two maps to it, assuming the element doesn't die two minutes after insertion or fall down a vent shaft, haha! I am very tired of how many delays we often bump into at the start of an LO lately, so this time out, I've enacted a stricture—you come late, you sit this one out in the TOC or can go buzz off. This live op is beginning strictly on time, even if I've gotta grease the wheels and push; considering that I've given everyone ~48 hours notice of the launch window, and will be tapping the assault team at least 15 minutes early for setup / briefing, there are no excuses!!! I think the assault team is going to have an interesting feeling, if they make it to the end of the first map, muhuahauhauahuhaaua!!!!


You know, I could make a fortune if people would pay me to think of contingency plans and potential attack vectors, just look at my service record with the EVR lol.

Thursday, October 1, 2009

Personal Training Cycle, 2009-10-02

Map 0: Meat Packing Plant 1, RvS Mission
Kit: Medium vest, MP5SD5+Scp, P228+HCM, 3 Bangs, 3 Gas
Results: Bullcrapped twice during clearing but aced it on the third go. Snuck in covertly until I was forced to cap a sentry through a window; no alarms were raised. Proceeded to secure the building room by room, penetrating into the upstairs and securing my backdoor. Normally in server, we usually take the easy hostage first with bangs then go for the hard ones; for me, not so! I'm sorry, but I just don't feel it is realistic to bang the downstairs kitchen, then creep into position upstairs ^_^. I set up an ambush on the catwalk; downed the guards with my H&K before they could raise an alarm - pick and choose your angles carefully. Moved the catwalk hostage into the hall for safe keeping, then I hit the gas pedal—full dynamic mode: blew through the next office with a flashbang, snatched the hostage, then moved them both to the next assaulting point. Dropped them off, plunked a pair of aces in the next hole, then stormed through with my H&K leaving nothing but the third hostage alive. Collected all three hostages, then moved out for egress by a different route than my ingress, taking it cautiously but expeditiously through the facility. Extracted the two hostages, then went and capped the last tango who was wandering around trying to follow me lol.

Map 1: Penthouse, RvS Mission
Kit: Medium vest, MP5SD+Scp, P228+HCM, 3 Bangs, 3 Gas
Results: Failed the first run due to being a moron, got spotted on the second go when the office door jammed on me. Third run was like clockwork... Solid Snake couldn't be more of a sneaky son of a bitch. A lot of people use Heart Beat Sensors and Smoke grenades to clear the Penthouse mission in RvS; me... I don't believe in it for training purposes, it should be hard ^_^. I went in using no tactical aids, just two eyes and a beady little brain to sneak through undetected.

Map 2: (MP) Presideo, RvS Terrorist Hunt
Kit: Medium vest, MP5A4, P228, 6 Bangs
Results: Crept into a suitable place to light the fireworks, then moved swiftly at a controlled pace, securing the entire building. Got shot in the right shoulder after getting "overly zealous" on a dynamic entry, but the poor tangos were too unprepared to stand a chance. Had to track down the last terrorist, who was probably on my heels the entire map but just too slow to keep pace; hit 'em with a flash bang and a semi-auto shot to the head.


All engagements were done fully automatic with the H&Ks, except for the very last tango, which had too strong cover to warrant the chance at that distance (I had no scope).


If dinner wasn't ready, I'd go for the SWAT 4 portion of my cycle, but it will have to wait until tomorrow :-(.

Don't sleep debts ever get paid back?

The A/C unit desired to act up last night, so the work schedule had to be adjusted accordingly—not to mention being driven bonkers half the night. Luckily this morning only cost a perfectly nice dream... and 3 hours sleep lol. So far I've had two poor naps tonight, but no real sleep since :-/. Been running on an average of ~3 hours for work days, and around 5-6 split into segments on days off. If there's such a thing as a sane sleep pattern, I doubt that I'll ever see it again.


I've spent most of the day alternating between SWAT 4, Raven Shield, and transferring notes into my internal wiki; at least I'll be able to decrease the size of my home directory, once I have Vectra configured to automatically back up the wiki files as well! I've maintained notes and copies of documents for years, and on/off have been trying to downsize it over the past several months. Such things change much less often than the rest of my home dir, so it is worth the transition; it'll keep the dumps smaller. Right now, my main concern is actually the life spans of hard disks: less data to dump, less disk activity per backup cycle. Around December or January, it'll be time to start work on the now bi-yearly CD-ROMs.

Most people I know are nuts about overwriting their files; me, I'm paranoid, so all I worry about is massive data loss that I didn't cause myself >_>. The lack of a recycle bin on UNIX systems actually fits my brain well: if you're going to want it back, why on earth delete it in the first place? Somehow I think the world would be a better place if more people learned how to place important documents under version control and just be done with it!


Tomorrow, eh, now today... is going to be a long day; the only upside is work doesn't start until the afternoon, so no need to get up early - I hope. I really don't feel like sleeping right now, but it's that or be running another 26-28 hours on no food nor sleep +S. *Sigh* I could get so much more done if days were 36 hours instead of 24, or if life here was quite different...