Rj Ponders - Losing time correcting my dirty code


I am having quite the problem currently. I learned that the quick links I have been using are actually very dirty. They were seriously slowing down the site, and finally they created errors. I had thought the slowdowns were possibly caused by other things. I'm so busy it's hard to get to maintenance or to expanding the site. Dealing with Absolute Zero alone literally maxes me out.

I had to figure out what the problem was when I kept receiving scripting errors. It's getting closer to 4 AM, and I have not slept at all. I wanted to finish creating the Robson series and work more on the Absolute Zero blog series “Return..”. Instead, I am fixing the dirty code that I created and compounded. It is slow, tedious work to hand-code everything just for the Youauthorus quick-update navigation page.

I will later have to do this for a lot of the links I created in this dirty fashion. To help you avoid problems like this, I will tell you how it happened. Instead of going to the page, copying the web address, returning to my text, and manually linking it, I chose to use text that was pre-linked (automatically) in the title of every page, post, tag, etc. That is not much of a problem if you don't run any analytics programs. However, if you do, the tracking codes attached to those automatically generated links remain in place. The page then spends extra time processing the redundant analytics code and gathers the same information over and over. And if the same tracking code repeats on the same page, it creates a benign error. All of this slows the page to a crawl and makes it difficult to be productive.
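If you are facing the same mess, here is a minimal sketch of what "cleaning" one of these links means, assuming the tracking codes are ordinary query parameters tacked onto the URL. The utm_* names below are a common analytics convention, not necessarily the exact parameters your program appends:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical list of tracking parameters; swap in whatever your
# analytics program actually appends to generated links.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign",
                   "utm_term", "utm_content"}

def clean_url(url: str) -> str:
    """Return the same URL with known tracking parameters stripped."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(clean_url("https://example.com/post?utm_source=widget&utm_medium=link"))
# -> https://example.com/post
```

The cleaned link still points at the same page; it just no longer drags the analytics baggage along with it.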

The fix, as I later discovered, was either to go into the code and remove all the tracking code, or to create fresh, cleaner code (appearing phenotypically the same but carrying less recessive information). I tried the first method, but there was just too much dirty code from all the copying and pasting. I tried automatic code-cleaning tools from third-party software, but they did not remove the dirty tracking code. So it seems I am forced to literally hand-code all of the information back. If you notice people avoiding a certain practice, take it as a caution. Either way, it was a learning experience; I would not have known about this if I had not gone through it. I will be more economical with tags and links. So don't use dirty links or dirty code; correcting the damage they do takes more time in the long run.
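If your pages happen to be static HTML files you can edit directly (which may not be true on a hosted blog platform, and is presumably why hand-coding was my remaining option), a short script building on the clean_url sketch above could automate the first method. The site folder name and the href pattern here are assumptions for illustration, and the regex only handles double-quoted href attributes:

```python
import re
from pathlib import Path

HREF_RE = re.compile(r'href="([^"]+)"')

def clean_page(path: Path) -> None:
    """Rewrite every double-quoted href in a static HTML file,
    stripping known tracking parameters via clean_url() above."""
    html = path.read_text(encoding="utf-8")
    cleaned = HREF_RE.sub(lambda m: f'href="{clean_url(m.group(1))}"', html)
    if cleaned != html:  # only rewrite files that actually changed
        path.write_text(cleaned, encoding="utf-8")

# "site" is a hypothetical root folder holding the exported pages.
for page in Path("site").rglob("*.html"):
    clean_page(page)
```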