Repost from 6 years ago: Improving placement of your website

I have been told that getting your webpage selected for prominent display by the search engines (the first listing, or at least the first page) is an arcane art requiring a highly paid professional. I reject this notion; my experience has shown that a few straightforward, common-sense tactics can get any web page favored by search engines. The various search engines are reasonably open about what they “score” highly for selection.

Recently, Google and Bing posted notes regarding changes to their scoring schemes; both now favor pages that include content for the vision challenged, in addition to favoring “living” and “dynamic” pages. This is good news all around. The web has become very visually oriented, to the extent that, until recently, the sight impaired, those with limited motor control, and people in a vast array of other situations that make experiencing web content a challenge got an extremely reduced version of the web. Whatever your reason for building a web presence, delivering your message to the maximum audience is desirable.

Being mindful of the newer guidelines for the search engines not only helps you get higher on the results page but also lets you reach more of your potential audience. A marketing specialist can help you get a higher percentage of reached individuals to engage with your content, while building pages that the search engines favor helps you reach a larger audience; both factors are extremely important in forming a successful campaign.

There are simple things we (as builders of the website or blog) can do to gain favor under these new guidelines. When posting a photograph, fully describe both the photo and its relationship to the text or other materials surrounding it; often this can be entered as text in the “alternate text” field associated with the photo. Similarly, when embedding a video, a table of numbers, or a graph, fully describe it and how it applies to the material it is meant to enhance or exemplify. If you are using WordPress, media items (photos, videos, sound recordings, charts, etc.) can easily be annotated as you place them from the media library with a title, a caption (visible to all), alt text (used by screen readers and search engines, and visible to those requesting text mode or whose browser settings prevent picture loading), and a description (available to the same group as alt text). All of these can help earn search engine favor and help convey the importance of that media to both the search engines and all viewers. A quick hint for WordPress users: if you will use a media item multiple times and want different annotations, create multiple copies, because the annotation stays with the library entry, not the “instance” on the page.
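Search engines and screen readers both read the alt text straight out of the page markup, so it can be worth auditing your pages for images that lack it. Here is a minimal sketch using Python's standard-library HTML parser; the sample markup and file names are purely hypothetical:

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collect the src of every <img> tag that lacks descriptive alt text."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            alt = (attrs.get("alt") or "").strip()
            if not alt:
                self.missing.append(attrs.get("src", "?"))

checker = AltChecker()
checker.feed('<img src="logo.png" alt="Company logo"><img src="chart.png">')
print(checker.missing)  # images that still need alt text
```

Feed it the HTML of a page and anything it reports is a spot where both your accessibility and your search engine score could improve.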

As you format the “page”, remember that we want the page to be dynamic; in this case, I refer to the ability of WordPress and other page layout products to format the page differently to optimize its appearance and function for various kinds of viewing devices (smartphone, tablet, PC, Mac, etc.). This can involve choosing larger fonts (14pt and above is recommended for small displays and the visually impaired), avoiding hard-to-read fonts (artsy, but more difficult for everyone to read), and keeping a good white-space balance (this covers many things, but it amounts to adding white space so that paragraphs are easy to spot, and photos, tables, and graphs are clearly identified with the text they enliven rather than jammed into it). Understanding that the page will be reformatted based on the viewer’s device might lead to using larger or smaller elements (media or text); but, generally, the use of a moderate-sized and easy-to-read font for all text will go a long way here.

Breaking up parts of the page to separate the different thoughts you are expressing may also assist the reader (particularly after the dynamic page layout engine shapes it for small displays). In addition, you want to include “tags”; this is your chance to give the search engines words or phrases that you think your content explains or answers. If someone searches for “this tag”, I want my page provided at the top of the list or prominently in it. This makes “tags” an important consideration and a powerful tool for getting your page presented to those searching for your offerings.

Depending on what tools you use to create your pages, “tags” will be available in different ways; in WordPress, each post and each page has a field that you can paste your tags into. A quick step back: when we were filling in the alternate text and description fields for media earlier, we would like to include (smoothly, artfully) those same tags in the descriptive text for those media. Back to “tags”: these are not sentences; the search engines are looking to match a keyword or a two- or, maybe, three-word phrase, and the “tags” themselves are primarily there for the search engines. It is extremely rare for a viewer of a page to dig deep enough to ever see the tags associated with it.

Now that we have some content on our webpage, hopefully very attractive to readers and search engines alike, it will help to go to the various search engines and “register” our page. In this step, the search engine is, once again, informed of the “tags” for our intended content and audience, and you may choose to pay for higher placement or for advertising space. Some of the engines will allow you to enter a description or even a lengthy briefing designed to draw attention to your link(s) and provide business and/or personal information (address, hours, phone numbers, email address, etc.). This is a process you can do yourself or have a service or even an app do for you; I prefer doing it myself, as it often gives me hints about what the search engine is looking for and what other entities that might be in competition with me are doing.

Computer Glasses

This is a topic slightly outside my areas of expertise but well developed in my experience: optics (glasses) for working at a computer. This is an area where I am somewhat spoiled; yes, I have specially made glasses just for working on my computers. And, yes, I feel they make a huge difference.

The last time I had my eyes checked and got a new prescription for distance viewing, I also had my optometrist write me a prescription for computer glasses. We discussed the distance from my eyes to the surface of the monitor, and she (my optometrist) set up a set of test lenses for that exact distance; we tried a few options and picked the best possible correction for me at that distance.

With this prescription in hand, I ordered my computer glasses, including a blue light filter, the latest in UVA and UVB filtering, a non-reflective coating, and a dust-resistant, water-repellent coating. Even the latest LED screens cause ionization of particles and tend to make glasses pick up dust far faster than normal; this new coating (Crizal is one choice; Zenni Optical has another “brand”) really does make a difference.

For my last pair of glasses, I added polarization on top of all the other features and found the polarization to cause some trouble (tilt your head and the screen appears to go dark); so, since I use them exclusively indoors, I did not add polarization or any form of darkening this time and, I admit, I am happier with this for computer glasses.

Spend some time talking with your optometrist regarding your specific needs and desires for any common work spaces you use. There are so many wonderful tweaks that can be made to improve your vision in specific environments, and the cost of most of them is very reasonable. Task-specific optics often run just $40-$60.

5G Cellular and Radiation

With the rollouts of 5G starting, one question, at least, remains unanswered: is the radiation from the lower-band 5G signal going to be dangerous for users? I have done some research and reviewed research done by many others, including the US Navy and the WHO (the World Health Organization). The first step was to figure out what this new standard is and exactly what kinds of transmissions, both active and passive, its adoption will bring to our environment.

5G is being adopted differently by different carriers. AT&T thus far is taking some of the technological improvements of 5G and applying them to current cell-use bands, resulting in enhanced 4G (not really 5G) that provides maybe a 30% improvement in transmission throughput and a reduction in congestion. Verizon has adopted a 3-band (spread spectrum) approach, with the low band around 600 MHz and the high band at 24 GHz to 52 GHz. The mid band is expected to be in the historic cell band at 800 MHz to 900 MHz (a sweet spot for human exposure safety, as determined by US Navy testing with radars back in the mid-1900s). T-Mobile appears to be focusing on the low band (around 600 MHz) with their current rollout.

There are two reasons for moving the new service to a different frequency band: first, to get out of congested bands where competition for spectrum prevents the growth of bandwidth through the use of wider frequency bands; and second, the 600 MHz band is better at reaching out in rural areas simply by the nature of its lower frequency (higher-frequency transmissions are more line-of-sight and won’t bend around or pass through foliage or other obstructions nearly as well as lower-frequency transmissions). As a trade-off, it is expected that the new towers for 5G will use roughly twice the signal power of the current 4G towers (to compensate for lower-frequency transmissions requiring more power to achieve good signal quality over a given path). Verizon is testing its GHz system exclusively in event locations and large cities, where the extreme line-of-sight and short-range character of those transmissions is a benefit instead of a liability (think of small low-power transmitters on every street corner or well spaced within a stadium).
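The range penalty of the higher frequencies can be illustrated with the standard free-space path loss formula; this is only a rough sketch, since real-world propagation adds foliage and building losses on top of free-space spreading:

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.45."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.45

low = fspl_db(1, 600)      # low-band 5G at 1 km
high = fspl_db(1, 24000)   # bottom of the millimeter-wave high band at 1 km
print(round(low, 1), round(high, 1), round(high - low, 1))
```

Over the same 1 km path, the 24 GHz signal loses roughly 32 dB more than the 600 MHz signal, which is why the high band only makes sense with many closely spaced, low-power transmitters.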

Okay, so with a rough notion of the characteristics of the new standard, what are the concerns surrounding this change? According to the WHO, as the frequency drops from 900 MHz, the ability of the signal to pass into and cause changes in human tissues increases considerably; said another way, the absorption rate of skin tissue increases. Add to this that the 5G towers will be transmitting with more power (as yet, there is no clear word on whether the hand units will use more transmit power for 5G), and there should be an expectation of far greater exposure for humans.

What do we know about safe levels of long exposure to RF radiation? Research on this topic goes back over 80 years, and it appears that exposure sufficient to raise tissue temperature 1 degree Celsius has been the threshold at which deleterious effects can be expected. (Anyone remember using old cordless phones that caused your ear to feel hot?) How much signal is required to produce this effect? A guideline from the WHO indicates that a SAR (Specific Absorption Rate) of 4 W/kg (the result of 4 W PEP at a distance of 1 meter from the source) meets this threshold and will cause both a tissue temperature increase of at least 1 degree Celsius and generalized impacts on the living being, including behavioral changes, induction of lens opacities, and adverse reproductive outcomes. Induction of cancers and similar maladies requires more study but may also be probable at this level of exposure.
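As a rough back-of-envelope check, assuming an isotropic source radiating in free space, the power density arriving at that 1-meter distance works out as follows (note this is only the incident power density at the body; the actual SAR also depends on tissue properties, so this is an illustration, not a dosimetry calculation):

```python
import math

def power_density(p_watts, r_meters):
    """Incident power density (W/m^2) at distance r from an isotropic source,
    spreading the power over a sphere of radius r."""
    return p_watts / (4 * math.pi * r_meters ** 2)

print(power_density(4, 1))  # about 0.32 W/m^2 for 4 W at 1 meter
```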

Given these vague indications, we can expect phones using the new standard to cause more exposure to RF radiation; but, much more study needs to be done to determine if actual phones using this standard will cause or increase the incidence of related health and mental issues. The crux of the situation is that the new phones could operate using sufficiently low power that no significant tissue damage will occur short or long term; but, I don’t believe appropriate research has been done yet to know one way or the other.

Windows 11?

I see rumors flying around about the next version of Windows. One writer suggested it will be called Windows 11. My impression is that it is more likely to be called Windows 10 Spring Edition or something of that nature, with a remote possibility that it will be Windows 12 or Windows 20 (for Windows 2020). Microsoft has been making noises using “new” language for their operating system to come; but, I don’t see that as an indication of a new name for the product so much as an attempt to get a more standardized way of talking about the product family.

How soon will we see the next version of Windows? Well, many of us recently received updates that, historically, would have been touted as a new version. Version 1903 started being pushed out to users in May of 2019, and by September many of us had version 1909. Since the desktop was not significantly changed, most users didn’t notice, and I consider that a good thing. Windows 10 is finally looking like a mature product, and some trust is being developed in it as a stable platform for getting work done. I look for a similar timetable for Windows 10 version 2003 to appear in 2020.

I would like to see a few major changes in the new versions; first on my list is a much-improved update process, followed immediately by a change in how software installs in Windows. We can see a hint of the change I am looking for in the new “Apps” which are like programs but make smaller changes to Windows as part of the installation process. I want to see the “new” Windows totally separate the changes made during install from the registry for the operating system. If this means having 2 registries, one for Windows and a separate one for applications and programs, I am all for that. Let us see an end to the installation of one program having any chance of affecting that of another related or unrelated program. Similarly, let us see an end to Windows upgrades having any impact on the operation of installed programs.

What little insider information I get indicates that a new upgrade system resulting in a completely different user experience (no user intervention of any kind anticipated in the new upgrade system) is coming soon to Windows; but, no dates to expect that yet. At the same time, Microsoft continues to develop new Apps for Windows 10 to continue fleshing out what they plan as a complete user experience.

I missed Black Friday, did I miss the best deals?

Black Friday and Cyber Monday have passed; now the more rational Christmas season closeouts begin. As always, I have my own opinions on what represents a bargain. For a personal computer or laptop, I recommend a minimum of an Intel i5, or an AMD Ryzen 3 or A8, or better, to support Windows 10 or Mac OS. I know there are a lot of machines out there with other processors that may look attractive; but, other processor designations are either a few years old or quite a bit less capable and thus a poor choice for a personal computer.

Historically, I have addressed some exceptions to the above processor guidance, and I will repeat them here because there are some really nice machines that don’t fit it. These machines are intended for folks who do not need much processing power in their device; folks who want to read email and the daily news, maybe respond to an occasional letter, and maybe view a spreadsheet. For this kind of application, where all the processing is done on the internet, a whole plethora of machines opens up. Processors like the Intel Atom, the N3 series, or even the E series processors from AMD are designed for this kind of use and do very nicely.

These “smaller” processors are designed for low power consumption, and the resulting devices can be quite small, light, and handy to carry. They start to have issues when folks try to do graphics or photo editing, watch high-resolution (4K) videos, or do any sort of real-time map work. In addition, complicated or large documents and spreadsheets view fine but can be painful (slow to respond) to edit on the smaller machines. On the other hand, because they consume so little power, even a minimal battery may provide 7-16 hours of use.

Whatever you are looking for, 6-month-old designs abound in the marketplace, and these are wonderful because they have some history: you can look at reviews and identify the units that have been successful (the design failures, the lemons, can also be identified), and those are the units that will be closeout-priced to make room for the new models recently announced.

Turkey? or not Turkey?

I started my reading today with ZDNET’s 2019 Turkeys. This is an annual effort to highlight the failures and disappointments in the technology industry. I must admit that I was unaware of some of the things they highlighted and, honestly, quite disappointed in ZDNET for some of the others they took issue with; then there were the obvious entries that I think they had every right to show disappointment in.

One of the Turkeys this year revolves around Google purchasing Nest and some related home security and convenience product lines. At the heart of the issue are two problems. First, Google has taken these products from an open architecture (a field of technology and protocol sharing, so others could develop products to work with Nest and associated security products) to a closed architecture, where only Google will develop and sell compatible products and services. The second problem is that Google has not addressed privacy issues for those using this and other families of their products and has a habit of using the “presumed private” data in some very not-private ways.

Another Turkey suggested by ZDNET is Google and its lack of appropriate behavior with large blocks of personal medical records that they gained access to by virtue of that data being managed by them in the cloud. It remains to be seen if legal action will be taken by patients, organizations or even local or national government agencies; but, initially, it appears as if that might be appropriate.

A couple of Turkeys were handed out for a lack of acceptance of the USB-C standard, which ZDNET had hoped several brands would embrace this year with their new products. Because the connector is more damage resistant and easier to plug in correctly, combined with enhanced power capacity and data rates, it was hoped that it would be widely used on new devices; however, Apple and Samsung appear not to have been ready for that change. Maybe ZDNET does not fully understand the marketing strategies (planned obsolescence and designed-in fragility to guarantee more sales?) of these phone manufacturers.

Highlighted in several of the Turkeys this year was the disparity between the presumed privacy of cloud storage and the actual level of protection for private material stored in the cloud. Specific issues appear to be access allowed to law enforcement (without need for a court order or warrant) and use of data by the cloud manager for marketing, research, and guidance sold to businesses, including insurance firms. It appears that this may have been done within the letter of the privacy statements attached to the sign-up process for obtaining cloud storage and other services; but, for the consumer, those statements are meaningless without considerable legal assistance and a full understanding of the possibilities for use and misuse of the stored information. In short, even the above-average consumer is unable to grant informed consent.

Moving web content?

Has anyone else noticed that web pages don’t hold still anymore? Even with a fairly fast connection, I watch a page load and see the link or button that I want to click and as I click on it, it moves, so I click on something else. Sometimes this is harmless, always it is frustrating, and occasionally it is dangerous. My understanding was that the new version of HTML (now more than 2 years old) would correct this issue; but, my experience is that it is getting worse. I know that part of the blame rests with the browsers and the desire by the authors of the browsers to make it feel like it is loading pages faster; but, allowing the pages to revise as they load to fit on the screen is at the heart of this issue. I am quite willing to allow a few seconds (a very few) for the browser to load enough of the page material so the page loads material in its final position instead of wasting all that video effort on moving things around after the page presentation has started.

New Computer? How do I move in?

When I get a new computer, what can I transfer? What programs or apps must I replace, and which ones can I move to the new device? Certainly, all of your documents, photos, and videos can be transferred, as can any music which you loaded from your own original media (unless you use Apple Music, in which case you will have to transfer from the original again). And any programs which are licensed to you, rather than to the machine, may be reloaded on the new machine.

Wait! Programs can be licensed to a machine? Yes, Windows and OSX are generally licensed to a specific machine and use digital serial numbers from the BIOS chips and other hardware identifiers to ensure that you do not move them to a new device. Many versions of Microsoft products (Office, server modules, etc.) also do this; this is one of the differences between Office 365 and the purchase-once versions of Office. There was a short period during which you could uncertify a copy of Microsoft software and then install it on another device; but, that feature seems to have disappeared.

The newer versions of most browsers (Firefox, Chrome, Brave) can synchronize your preferences to the cloud and then recover (synchronize) them to a new install on the new device. Clear as mud? Programs, unlike documents, require changes to the machine they run on in order to operate correctly. So, productivity suites (like Microsoft Office) and browsers (like Firefox) must be installed rather than transferred. Most applications (those that are not included as part of the operating system) must be installed on the new machine first; only then can your preferences be added onto the new device.

There are a number of ways to transfer your data (documents, pictures, videos) from an old machine to a new one, and each situation may favor a different approach. If the old device is no longer operational, recovery from a backup or synchronized cloud storage may be the best bet, though there are ways to extract data directly from an old storage device by removing it from a dead computer and feeding its contents into a newer one.

Whichever technique is appropriate in your situation, it is often helpful to have a practicing consultant assist you with this step.

What should I save?

This week I am rewriting an article from a couple of years back. At issue is what we should save when we buy computers, tablets, phones, computer software, apps, operating systems, etc. First and foremost, save any and all licensing information. This may be a “number”, a sticker, or an install code. Should you be concerned when the “things” I recommend aren’t included with your purchase?

The key, from my perspective, is to have everything you will need to reinstall everything when the system or device crashes, or you replace it (new phone every 2 years, new computer every five years?). Perhaps even make a log or list of all the additions you have made to the device. Why do I suggest being so careful with these licenses? Without proof of ownership, you will be put in the position of having to buy it again; very much like paying a second time for something you already have.

Any purchase that includes software should include license information; this may be a number, a code, a sticker, or a certificate that embodies proof of license to use. This is usually the only “thing” that is important to keep (I’ll discuss an exception later), and it is extremely important that this proof be kept. In most cases, the media needed to reinstall can be obtained if and when that becomes necessary; but you will need that proof of purchase, or license information, for the install to succeed.

Fortunately, the license information (whether it is a sticker, a sheet of paper, or a small card) is usually quite small and easily stored (and also easily lost); however, this is where many computer owners get into trouble. You really do need to identify and save that original item or information. If you buy online, it may come in an email (yes, print that and save it; yes, create a PDF of it and put it with other important documents); such emails really (in my opinion) should be saved as files on your computer and included in your backups (shove them up to the cloud too). In some cases, I have also taken the step of photographing the license certificate and putting that up in the cloud.
(Important note: just keeping the email is not good enough; it is really common to lose emails over time. It is almost guaranteed you will lose the necessary email in the event that you have to reinstall something.)

Please note, you don’t own software; at best you own the privilege of limited use of the software you purchase. It is this distinction that leads to what I see as a serious problem currently infecting the computer industry. What if my purchase did not include any such materials? You buy a computer, and there is no sticker attached to it with the license information for Windows or OSX; there is no license information for the productivity suite that came with it (Microsoft Office, for instance), and there are no installation disks included either. Two possibilities: either the software you have acquired is not legitimate, or the license information is embedded in the product. Let us assume the latter; in this case, you need to immediately make installation or recovery disks (very much like a backup) before there is any opportunity for something to go wrong (if you are lucky, there is a routine for doing this all prepared for you).

One of the ways I choose between computer manufacturers is to look and see if the product comes with restoration media (or original install media) AND license materials; if it doesn’t, I am highly unlikely to make the purchase. In my mind, my having to make the media will cost 3-4 DVDs and a few hours of my time; a machine that costs $100 more but includes these things may be the better buy.

Even at today’s prices the purchase of a computer, phone, tablet or laptop involves a significant amount of money; please make an informed choice when buying. If in doubt, let Benediktson Computer, Inc. help. It just takes a phone call.

Communication: it means more than just sending.

It is Wednesday again and time for my monthly tirade. When did sending an email to someone, or not reaching them on the phone, come to constitute legally apprising them of a contractual issue? There is a reason that email systems have a receipt, or receipt-requested, feature (for that one person in the back: you can request notification that someone has received a specific email from you). Successful communication, to me, means you sent a message AND received acknowledgment of receipt and understanding of the message (ACK/NACK handshaking, in computerese).

As I said last week, this is the time of year when I am renewing contracts with service providers for some of my clients (and for some services I use as well). One of my providers claims to have been trying to get word to me all month (January) that the terms of use for some services I use have changed and that some of my sites are not in compliance with the new regulations. Now, understand: they have two phone numbers for me, plus an emergency contact number in case of phone problems; all my phone numbers have voice mail systems that have been continuously up and fully functional all month; and they are one of a very few who have my private email address. Yet they did not actually get any message to me until Monday, after I pointed out to them that some websites were not working correctly.

I dutifully pulled and scanned the transaction logs for all of my email addresses, and sure enough, they had sent me an email last Friday that immediately went into the junk folder, based on its content and on NOT being from a recognized sender. It seems that they have hired a third party to negotiate with grandfathered-in clients who have long-standing contracts, such as mine, that no longer fit their notion of an ideal client. These folks are really good at coding emails, and this one was lovely, with some nice graphics, 5 or more links to outside sources, a few phone numbers, etc. In short, the spam filter immediately recognized it as spam and dealt with it appropriately.

So, some notes about spam and spam rules. Most email clients, and some of the better webmail clients, have a dual verification system for junk mail (sorry, I wanted to use both “junk mail” and “spam” because the email clients use those words interchangeably). Part 1 is to check and see if the sender is in your address book (this is not your whitelist, but it can function as one if you turn this feature on); if the sender is in your address book, you can have the email allowed even if it contains suspicious content. Part 2 is to score the email based on its content, where a number of factors come into play: how many people is it sent to (more recipients means more likely spam)? Does it contain graphics that are not identified (a photo carefully tagged as “company logo” or “mountain cottage” is fine; one with no tag adds to the spam likelihood score; a graphic tagged “get your viagra here” gets a really high score)? Does it contain links (lots of links means a high spam score)? Do the links point to known “No No” sites (guaranteed spam)? After these and other factors are considered, a high spam score gets the email treated as spam, and a low score allows it to pass into your inbox.
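The two-part scheme can be sketched as a toy scorer in Python; the weights and factor names below are invented purely for illustration (real filters weigh far more signals than this):

```python
def spam_score(sender, recipients, n_links, n_untagged_images, link_targets,
               address_book, blocklist):
    """Toy two-part junk-mail check: address-book bypass, then content scoring."""
    # Part 1: a sender in your address book bypasses content scoring entirely
    if sender in address_book:
        return 0
    # Part 2: score the content; these weights are made up for illustration
    score = 0
    score += max(0, recipients - 5)      # many recipients -> more likely spam
    score += 2 * n_links                 # every link raises the score
    score += 3 * n_untagged_images       # unlabeled graphics raise it more
    if any(target in blocklist for target in link_targets):
        score += 100                     # a known "No No" site: guaranteed spam
    return score

# A mail from a stranger with 5 links and an unlabeled graphic scores high...
print(spam_score("stranger@x.com", 1, 5, 1, [],
                 {"friend@y.com"}, {"badsite.example"}))
# ...while the same content from someone in the address book passes untouched.
print(spam_score("friend@y.com", 1, 5, 1, [],
                 {"friend@y.com"}, {"badsite.example"}))
```

The mail from my provider's third party, with its graphics, links, and unknown sender, would rack up points on nearly every line of a scorer like this.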

In addition, some email clients (Thunderbird, Outlook webmail, Microsoft Outlook, Gmail) have the ability to learn what kinds of emails you consider spam (junk mail). This feature takes some time and effort on the user’s part but can be very helpful; the intelligence in these features can see similarities when a spammer is changing their outgoing address, and can recognize similar addresses, servers, businesses, etc. It may take a month or more of the user identifying emails as spam (and as not spam), but I have been impressed with the results once you get enough information into the system.

AI or not AI

In advertisements on TV, in magazines, and in papers, it is common to see the word AI used. Far too common, in my opinion; there was a time when AI referred to artificial intelligence: the ability to learn and the ability to make distinctions in fashions indistinguishable from humans. There was an accepted test (the Turing test) and an accepted way to apply that test to see if the barrier had been breached and a machine had actually achieved artificial intelligence.

It seems that all of that has now been forgotten, and AI is used in place of what were once called “expert systems”. I have looked at some of what is being advertised, hopeful that AI was real and we could expect some truly exciting innovations; and I have, thus far, been disappointed. Well, disappointed and heartened, because one of my fears about AI is that, as an AI learns, it should develop its own distinctions and criteria for making decisions, and, left unsupervised, who knows what could happen.

So far, all of the AI systems I have looked at lack the ability to learn on their own (in a meaningful sense); yes, they can use rules they were programmed with to categorize, identify, and act in a preprogrammed fashion. All of that is fine and convenient; add to that the ability to consider blocks of data (history) far greater than any human consciously considers, and the results can be quite impressive. I feel that this is the good side of AI: a small portion of what it will take to establish the first AI system, the safe part.

Too many times in my working career I have been called upon to replace, repair, or simply audit systems where the humans relying on a computer system no longer understand the criteria it uses to perform its analyses. Often such a system has become an integral part of the function of some business or industry; but without knowing how it makes choices, how valuable are the data and the choices it recommends?

I don't know about you, but I want to know how the choices I rely upon are decided. An expert system may consider far more data than a human, and if the accumulation and consideration of that data offer a superior chance of a "best choice", I am all for it, so long as I, or some reasonable human, am clear on the validity of the criteria and the source data used in the decision-making process. The other elephant in the room, for me, is the old computing adage "garbage in, garbage out": if faulty information (data) is used in the decision-making process, the decisions made cannot be trusted.

Selecting a new computer – as a tool.

I am continually amazed at how difficult it is to find what I want (or need, to meet a client's needs) in a regular production computer. Not that I am against building a custom computer, but the cost is often higher than buying a prebuilt machine and making a few key replacements. I know some of it is simply market pressure; I prefer solid state drives to hard drives (the spinning-magnetic-platter type), and they are more expensive in the short run. Similarly, the choice of processors in most retail machines makes little sense to me; combine that with the mediocre performance of Intel integrated video (compared to AMD integrated video or the addition of a video card) and it is easy to see why so many machines fail the consumer (fail to meet expectations).

Convertible laptop / tablet

Aside from my general preferences (use SSDs; use an inexpensive video card instead of Intel integrated video, or go with an AMD processor and its integrated video), the process of selecting a computer, while straightforward, can require some careful thought and planning. All Windows or OS X systems rely on graphics, but some applications really benefit from superior video ability. Photographic retouching, photo editing, video processing, and gaming, for instance, can bog down without appropriate video power. Many drawing and design applications will bog down with insufficient RAM and processor power. So it really is important to have a good idea of what the computer is going to be asked to do: which applications will be used, and how often.

To make the task of identifying what you need a tad more difficult, modern browsers can be called upon to perform a lot of video processing (Google Maps and Google Earth are examples, among other online apps) that can really choke a machine without an appropriate video system. Folks who like to keep a lot of active browser tabs, or several active applications they flip back and forth between, may notice some bogging down without quite a bit of RAM (8 GB–16 GB or even more), when just a few years ago 4 GB was the limit for anything short of a server or workstation-class machine.

So, once again, it is really helpful to determine what you are going to ask of a computer before you head out to select one. Choosing the right components, ones that work well together and provide adequate performance and resources, will make all the difference.

What must I save with my computer?

Some weeks the article I write comes easily, and some weeks I have to mull it over for a few days, starting and restarting a few times. This week's article is of the latter kind; my aim is clear, but how to deliver it in a fashion most people will understand is not. At issue is what we should save when we buy computer software, apps, operating systems, and so on. Should you be concerned when the "things" I recommend aren't included with your purchase?

Any purchase that includes software should include license information; this may be a number, a code, a sticker, or a certificate that embodies proof of your license to use it. This is usually the only "thing" that is important to keep (I'll discuss an exception later), and it is extremely important that it be kept. In most cases the media needed to reinstall can be obtained if and when that becomes necessary; but you will need that proof of purchase, or license information, for the install to succeed.

Why do I suggest being so careful with these licenses? Without proof of ownership, you will be put in the position of having to buy the software again; very much like paying a second time for something you already have.

Fortunately, the license information (whether a sticker, a sheet of paper, or a small card) is usually quite small and easily stored (also easily lost); this is where many computer owners get in trouble. You really do need to identify and save that original item or information. If you buy online, it may come in an email: yes, print that and save it; yes, create a PDF of it and file it with other important documents. Such emails really should, in my opinion, be saved as a file on your computer and included in your backups (shove a copy up to the cloud too). In some cases I have also photographed the license certificate and put that in the cloud.
(Important note: just keeping the email is not good enough; it is really common to lose emails over time. It is almost guaranteed you will lose the necessary email just when you have to reinstall something.)

Please note, you don't own software; at best you own the privilege of limited use of the software you purchase. It is this distinction that leads to what I see as a serious problem currently infecting the computer industry. What if my purchase did not include any such materials? You buy a computer and there is no sticker attached with the license information for Windows or OS X; there is no license information for the productivity suite that came with it (Microsoft Office, for instance); and there are no installation disks included either. Two possibilities: either the software you have acquired is not legitimate, or the license information is embedded in the product. Let us assume the latter; in this case you need to immediately make installation or recovery disks (very much like a backup) before there is any opportunity for something to go wrong (if you are lucky, there is a routine all prepared for doing this).

One of the ways I choose between computer manufacturers is to look and see whether the product comes with restoration media (or original install media) AND license materials; if it doesn't, I am highly unlikely to make the purchase. In my mind, having to make the media myself costs 3–4 DVDs and a few hours of my time; a machine that costs $100 more but includes these things is the better buy.

Even at today’s prices the purchase of a computer or laptop involves a significant amount of money; please make an informed choice when buying. If in doubt, let us help. It just takes a phone call.

1 phone, 2 numbers?

Another week and another challenge. The desire to have a cell phone answer calls for 2 numbers came up this week. It isn’t a new problem for us; we provide service to folks in several different parts of the United States and have used a VOIP (Voice Over Internet Protocol) solution for a while to make our phone a local call for our clients.

This week my choice of antiquated phone service caught up with me: the local cellular providers were unable to port my phone number to a new phone. The solution was to get a new number; but my clients are all accustomed to my old one. So I went searching for a solution that would let my clients call the number they know and reach my new phone (which just happens to have a new number).

The cell company wasn't going to provide a solution because they couldn't find a way to bring in the one thing that was important to me: that phone number. I found three solutions (I am currently implementing two of them), and they were all quite cost effective. The first was to port my old number into a VOIP system and set it to forward to the new number. It turns out that the VOIP people are accustomed to dealing with older phone services and were able (with a five-day delay) to port in the old number; I set up two call forwards to the new number so my calls would arrive at the new phone regardless of where I was in the porting process.

I also found two other solutions, one of them Google Voice. Google Voice is an interesting service; it requires that you have a Google account (these can be acquired at no charge) and provides an answering service for the chosen phone number. The answering service can either take a message or forward the call to your choice of cell phone or landline numbers (does anyone remember landlines, phones connected to a wire?). There are a couple of things to know before we all jump out and sign up: first, porting in an existing number costs $20.00 (a one-time cost), and second, the service answers calls like an electronic secretary. That is, it takes the call and asks the caller to state their name before forwarding. Based on how the caller answers, it will forward the call to all the numbers it has been programmed to forward to, or take a message. If it takes a message, it can email that message to you, alert you to check for it on your Google account, or a few other options.

Another service I found (that had good reviews) was Sideline. Sideline is a tad more transparent than Google Voice (no robot secretary asking for the caller's name) and may require a $0.99 fee each month (no fee, depending on which features you use), but it can also port in your well-known phone number and forward it to your choice of numbers. The one gotcha is that your original account number must stay with Sideline (so don't use your important, well-known number to set up the account unless you intend to leave it connected to Sideline). When all is set up, the caller calls your number, it is automatically forwarded to your chosen destination number, and the sound quality is like any other call; the only difference is that the incoming caller ID indicates that it is a forward.

In any case, with these services you can set up one cell phone to answer two or more numbers and go about your daily activities needing to carry only one phone. Naturally, each of these services can do other things for you, and which one you choose might depend on which has additional features of interest for your business. If you have questions or want assistance with other technology issues, please call Benediktson Computer, Inc. at (575) 956-9732 or email help@benediktson.com.

Dealing with slowing networks

Sometimes troubleshooting internet issues leads you through unusual twists and turns, and sometimes it is just about checking the basics. In the last week I have been asked to fix connectivity for two different businesses. When I hear that request, my curiosity gets the better of me; I really need to find the answer to "what is leading these people to ask for this service?" I confess that each case is unique, but it usually revolves around network or internet speed issues.

A quick history of networks and the internet might be helpful here. Way back when, in the age of dial-up service, we used coax cable in various topologies (rings, stars, chains, etc.), and networks transmitted data at the then-amazing speeds of 20k (20 thousand bits per second) all the way up to 2m (2 million bits per second). This was fine because printers could produce at most about 30 characters per second, and modems (the dial-up devices used to bring internet access to your network or to a single machine) were capable of 120 baud (approximately 12 characters per second), then 300 baud, 1200 baud (a fairly common speed at one point), all the way up to 9600 baud (realistically somewhere between 480 and 960 characters per second). As networking matured, standards like 10BASE2 and 10BASE-T took hold, and networking speeds outstripped the speed at which machines could take in data. Now we use networking based on 100BASE-T (fast ethernet) and 1000BASE-T (gigabit ethernet: 1 billion bits per second, with one character requiring about 10 bits on the wire) running on fiber optic or category 5 cable (cat5, cat5e, cat6); and, just for completeness, our modern networks are similar in layout to the old star networks.
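
To put those speeds in perspective, here is a quick back-of-the-envelope sketch. The 4 MB photo size and the 10-bits-per-byte framing figure are assumptions consistent with the numbers above, not measurements.

```python
# Rough transfer times for a single 4 MB digital photo at the speeds
# mentioned above, assuming 10 bits on the wire per byte of data.
BITS_PER_BYTE_ON_WIRE = 10
photo_bytes = 4_000_000  # a modest 4 MB photo

speeds_bps = {
    "300 baud modem": 300,
    "9600 baud modem": 9600,
    "10BASE-T (10 Mbit/s)": 10_000_000,
    "fast ethernet (100 Mbit/s)": 100_000_000,
    "gigabit ethernet": 1_000_000_000,
}

for name, bps in speeds_bps.items():
    seconds = photo_bytes * BITS_PER_BYTE_ON_WIRE / bps
    print(f"{name}: {seconds:,.1f} seconds")
```

The same photo that takes a day and a half at 300 baud moves in a twenty-fifth of a second on gigabit ethernet, which is why the old speeds were fine for text and hopeless for today's media.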

Up until recently, the new network speeds were significantly faster than anyone had a use for; most emails were very small (a few hundred to a few thousand characters at most). Now we have several common technologies that require HUGE amounts of data: digital photographs can run from 3 million to 40 million bytes (8 bits to a byte) or more, videos may consume 3 billion or more bytes per hour, and documents that once averaged 800 bytes per page can now (because they may include photos, graphs, drawings, videos, etc.) use more than 1 MB (megabyte, roughly 1 million characters) per page. The result is that it is now possible for a small office to experience congestion on its network (like traffic at rush hour: everything slows down and may seem to come to a complete stop). To exacerbate the issue, we also have many more devices using the network, and each device can be running all kinds of helpful and cherished utilities that depend on network and internet support.
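
The congestion arithmetic is easy to sketch. The link speed and per-stream rates below are illustrative assumptions, not measurements from any particular office.

```python
# How quickly a few modern data streams fill a small office internet link.
# Assumed figures: a 25 Mbit/s uplink, 5 Mbit/s per video stream,
# 0.3 Mbit/s per internet-radio stream.
link_mbps = 25.0
video_stream_mbps = 5.0
radio_stream_mbps = 0.3

videos = 4   # four people watching video
radios = 6   # six phones streaming internet radio

used = videos * video_stream_mbps + radios * radio_stream_mbps
print(f"{used:.1f} of {link_mbps} Mbit/s in use "
      f"({used / link_mbps:.0%} of capacity)")
```

Just ten casual streams leave only a sliver of the link for everything else the office actually needs to do, which is the rush-hour effect described above.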

It is this new tendency to consume great amounts of data and send it over the local network or out onto the internet that is a primary cause of network congestion. What I see, all too often, is that most of this is unintended traffic; by that I mean that the folks using all of this bandwidth (transmission potential) have no idea that they are consuming all of this data, or any data at all. At least the driver who gets on a main arterial at rush hour has some understanding that they represent one small part of the congestion; computer users who listen to internet radio on their phones while at work (just one of hundreds of examples) mostly have no idea that they are using up a measurable percentage of the total internet bandwidth available to the office network (and adding to general congestion on the local network as well).

So, how does this affect you? I looked at a home network last week because the client complained of occasionally not being able to print and wanted me to fix the network connectivity. My first look revealed that all was well with the equipment and that a lot of data was being successfully moved by the network. After a few questions about what was included in the network and what it was supposed to do, I realized that I was seeing far too much traffic for those few pieces of equipment and their intended tasks. This can have several causes, and I attempted to verify or rule out the most common. A damaged network adapter or a damaged router (the device that routes information from the internet to all devices on a network, and between devices on a network) can cause this kind of extreme traffic buildup; a couple of tests and component replacements ruled that out. Next I checked whether the traffic was intended and simply far greater than the client (or I) suspected; a few more tests ruled that out too. I then asked the router (to be truthful, I started this early on, let it run for a while, and checked on it at this later stage) where the traffic was going and what its nature was. It was at this point that we (the client and I) learned that several smart phones in the house, innocently being used by household members, were responsible for 95% of the traffic on the network and were keeping it at 95-100% of capacity. The solution in this specific case was simple, and some were happy.
In another case this week, I had a similar request, and this time I found that a computer had been set to back itself up to the cloud daily. Whenever this computer had some idle time, it continued the process of backing itself up, and the internet appeared to come to a complete stop for everyone else in the office. The backup policies on that computer were adjusted (a feature of many of the better online backup services) so that it backed up only things that had changed, and the problem disappeared.
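
The "ask the router where the traffic is going" step amounts to tallying bytes per device. The record format below is hypothetical, but the tallying idea works with any router that exports per-device counters.

```python
# Find the "top talkers" from per-device traffic records.
from collections import defaultdict

# (device, bytes transferred) records, as a router might report them.
# These figures are made up for illustration.
records = [
    ("phone-1", 900_000_000),
    ("phone-2", 750_000_000),
    ("desktop", 40_000_000),
    ("printer", 1_000_000),
    ("phone-1", 200_000_000),
]

totals = defaultdict(int)
for device, nbytes in records:
    totals[device] += nbytes

grand_total = sum(totals.values())
for device, nbytes in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{device}: {nbytes / grand_total:.0%} of traffic")
```

A report like this makes the diagnosis obvious at a glance: two phones dwarf everything else on the network, just as in the case above.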

When your internet performance suddenly changes for the worse (who complains when it suddenly gets better?), it can be an issue with your provider, a change in your habits, a sign of virus or malware activity, or some unintended traffic plugging everything up. It can also be the result of interference with wireless transmissions, damage to a network cable or interface, and so on.
Whatever the cause or causes, it can be very frustrating, and it may be one of those problems you take to a consultant or network professional; this is another case where practice really speeds up the process of identifying and fixing the problem.

Updates and Security

Next week I will start a three- or four-part series of articles on how to get the most out of your website as an advertising tool; this week, however, I am focusing on updates and security concerns.
I get a skewed view of personal and professional computers because I see them when they are having trouble; the ones without issues aren't brought to my attention. Regardless of whether your computer represents a delicious target for hacking or contains nothing of interest to anyone, you run a fairly similar risk of being hit with a virus, malware, or even a directed hacking attack. As a result, the usual tactic of abstaining from updates until you have an issue may not be the best choice ("if it isn't broken, don't fix it" is more like sticking your head in the sand). Yes, I share the pain of the almost daily updates and the disruption they can bring. Some updates simply cause the machine to take a long time to start up (while I wait to use it when I was, naturally, in a hurry); some disable "features" that I, or some software package I use, depend on; and still others reveal sloppy programming on the part of the authors of some important (to me or the user) product. All of this and more are real frustrations associated with updates.
 
The complexity of modern operating systems, whether iOS, OS X, Windows, Android, or Unix (Linux), has created a need for constant patching and updating to stay ahead of the virus and malware industry. An update may not seem important to what you do on your computer, but odds are it, or an update that relies on it, will improve your computer's chances of rejecting some malicious code it encounters. If you ignore the updates (or reject them), you are leaving the door (your computer) wide open. Because of the variety of potential attack vectors, the simple anti-virus packages that were adequate a decade ago really aren't adequate protection today. Today you need a good firewall, a continuously vigilant anti-virus and anti-malware pair or package, and one or more auxiliary on-demand protection packages.
 
Even the best anti-virus and anti-malware products, combined with a good firewall and an up-to-date operating system (with all patches, etc.), will occasionally suffer an intrusion or infection; this is where an auxiliary package becomes important. These are usually standalone products (scanners, security helpers, etc.) that can be run on demand when you suspect an issue, or just prophylactically. I always choose one from a trusted vendor unrelated to the products that routinely maintain the computer and its security. Each provider seems to have their own specialty, and occasionally spotting an issue and removing or correcting it is a matter of choosing a tool from the right provider. Since I don't keep up on whose tools are best at which problems, I often resort to trying a couple until the problem is resolved.
 
Choosing your security tools is very important because not all security products are on the up and up; some are traps designed to get your system infected while others are more aimed at gathering information for some advertiser than protecting your security. Often I will run into an advertisement or popup claiming to be the next super tool to fix various issues or protect me from all threats; most of these turn out to not be what they claim and instead make things worse. This is where I have a luxury most computer users don’t have; I have a sacrificial computer that I can try these products on and learn about them. If they turn out to be trouble or make a mess out of my computer, I can simply reformat and reload Windows or Linux and be right back where I started.
 
So, what can the everyday computer user do? First, I recommend going with the tools built into your operating system; most have adequate firewall products, and these form your first line of defense. If you want something stronger to protect several devices, I recommend a firewall appliance (a small box that sits between your devices and the internet and filters your traffic before it reaches your devices). Next, a good combination anti-virus and anti-malware product (Windows 10 comes with Windows Defender, which is adequate for most situations, but several companies including Sophos, Norton, McAfee, and Trend Micro also provide adequate products). Third, I recommend on-demand scanners from any of the above-mentioned companies (most are available on their websites at no charge), in combination with products from Lavasoft, Malwarebytes.org, Piriform.com, Auslogics.com, and a host of others, to try to spot anything that sneaks through the regular lines of defense. I want to be clear, at this point, that it is extremely important to have good recent backups (more than just one or two, and preferably in a series going back days and weeks, in case the bug(s) have impacted data that far back); often, recovery from an infection will require restoring some data from a time prior to the initial infection (backup media is extremely inexpensive, and software to perform automatic backups is inexpensive or included with your operating system or security system).
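
The "series going back days and weeks" idea can be sketched as a simple retention rule. The counts below (seven dailies plus one backup per week for five weeks) are illustrative assumptions, not the policy of any particular backup product.

```python
# Given dated backups, decide which ones to retain: the most recent
# dailies plus one representative backup per earlier week.
from datetime import date, timedelta

def backups_to_keep(backup_dates, today, dailies=7, weeklies=5):
    keep = set()
    recent = sorted(d for d in backup_dates if d <= today)
    keep.update(recent[-dailies:])          # the most recent daily backups
    for week in range(weeklies):            # one backup per earlier week
        week_end = today - timedelta(days=7 * week)
        week_start = week_end - timedelta(days=6)
        in_week = [d for d in recent if week_start <= d <= week_end]
        if in_week:
            keep.add(max(in_week))          # newest backup in that week
    return keep

today = date(2017, 6, 30)
dates = [today - timedelta(days=n) for n in range(40)]
kept = backups_to_keep(dates, today)
print(len(kept), "of", len(dates), "backups retained")
```

A scheme like this keeps storage use modest while still letting you reach back several weeks to a point before an infection took hold.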
 
As with many other things about your computing devices, if you have valuable information at risk, please consider contacting a professional, both before you have issues (to help you set up appropriate defenses and maintenance routines) and once you believe you have an issue. The professional is likely to encounter these kinds of problems daily and has lots of practice resolving them, greatly improving the odds of a good outcome; and it may not take them nearly as much time to get you back fully functional.

SSD Reliability

SSDs are a new phenomenon in the datacenter. We have theories about how they should perform, but until now, little data. That’s just changed.

The FAST 2016 paper Flash Reliability in Production: The Expected and the Unexpected, (the paper is not available online until Friday) by Professor Bianca Schroeder of the University of Toronto, and Raghav Lagisetty and Arif Merchant of Google, covers:

  • Millions of drive days over 6 years
  • 10 different drive models
  • 3 different flash types: MLC, eMLC and SLC
  • Enterprise and consumer drives

Key conclusions

  • Ignore Uncorrectable Bit Error Rate (UBER) specs. A meaningless number.
  • Good news: Raw Bit Error Rate (RBER) increases slower than expected from wearout and is not correlated with UBER or other failures.
  • High-end SLC drives are no more reliable than MLC drives.
  • Bad news: SSDs fail at a lower rate than disks, but UBER rate is higher (see below for what this means).
  • SSD age, not usage, affects reliability.
  • Bad blocks in new SSDs are common, and drives with a large number of bad blocks are much more likely to lose hundreds of other blocks, most likely due to die or chip failure.
  • 30-80 percent of SSDs develop at least one bad block and 2-7 percent develop at least one bad chip in the first four years of deployment.
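
To make the UBER discussion concrete, here is what a quoted rate implies in expected errors. The spec value below is an illustrative round number, not a figure from the paper.

```python
# What an UBER spec actually implies: expected uncorrectable bit errors
# for a given amount of data read.
uber = 1e-15                  # assumed spec: uncorrectable errors per bit read
terabytes_read = 125
bits_read = terabytes_read * 1e12 * 8
expected_errors = uber * bits_read
print(f"~{expected_errors:.0f} uncorrectable bit error(s) "
      f"per {terabytes_read} TB read")
```

In other words, an UBER spec is a statement about errors per bits read; the study's point is that the field data did not match such specs, which is why it calls them meaningless.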

The Storage Bits take

Two standout conclusions emerge from the study. First, MLC drives are as reliable as the more costly SLC "enterprise" drives. This mirrors hard drive experience, where consumer SATA drives have been found to be as reliable as expensive SAS and Fibre Channel drives.

One of the major reasons that “enterprise” SSDs are more expensive is due to greater over-provisioning. SSDs are over-provisioned for two main reasons: to allow for ample bad block replacement caused by flash wearout; and, to ensure that garbage collection does not cause write slowdowns.

The paper’s second major conclusion, that age, not use, correlates with increasing error rates, means that over-provisioning for fear of flash wearout is not needed. None of the drives in the study came anywhere near their write limits, even the 3,000 writes specified for the MLC drives.
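
A back-of-the-envelope calculation shows why none of the drives approached their write limits. The capacity and workload figures below are assumptions for illustration; only the 3,000-cycle rating comes from the text above.

```python
# Years of life implied by a 3,000 program/erase cycle rating at a
# steady write load (assumed capacity and workload).
capacity_gb = 480                 # assumed drive capacity
pe_cycles = 3000                  # rated write (P/E) cycles for MLC
writes_per_day_gb = 40            # assumed fairly heavy daily write load

total_writes_gb = capacity_gb * pe_cycles
years = total_writes_gb / writes_per_day_gb / 365
print(f"{years:.0f} years to exhaust rated writes")
```

Even under this heavy assumed workload the rated endurance outlasts any plausible service life, which is consistent with the paper's finding that age, not usage, is what matters.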

But it isn't all good news. SSD UBER rates are higher than disk rates, which means that backing up SSDs is even more important than it is with disks. The SSD is less likely to fail during its normal life, but more likely to lose data.