Blown to Bits by Hal Abelson, Ken Ledeen, and Harry Lewis argues that because websites have become cheaper and easier to create, the web is in a state of constant change. What does this mean for the Internet? Will the number of reliable sources decrease? How can we truly know which sources are most reliable? The text claims that "editors decide what went in each category, and what got left out." How do we know that "non-endorsed" sources are not the most trustworthy or accurate? Should the people, or the majority, have the right to control search engines? Since the web lacks structure (according to the text), should there be qualifications for posting on it?
Since the number of ideas on the web continues to grow, should educational institutions encourage students to develop their own theories rather than rely on the theories of others for credibility?
Why is Google one of the most highly recommended search engines? Is it the appearance and themes, or the search results? Who determines which search engines we use? Given how few search engines dominate, should people have the right, and easier means, to create and promote new ones?
When our class was searching for guitar cases on Amazon, why did the results become limited after we searched for cases across all departments?
According to many individuals I have spoken with, hashtagging must be precise and limited in order to get the best results and networking experiences. I wonder why there is such an emphasis on not using many hashtags. According to a YouTube video titled "What Is a Hashtag?," hashtagging is a way to connect with others who share similar interests. Why, then, is one criticized for using many tags? After all, would that not result in much more exposure? As long as the tags are relevant to the topic, I do not see the problem. One website, howtohashtag.com, claims that too many tags annoy fans and look like spam. What if powerful individuals are trying to limit our ability to build our networks?