Well, as you can see, the search engines are displaying more and more results drawn from the most recent items in their News search.
How many times have you been researching something and thought you'd found what you were looking for, only to find out that the article you were reading is two years old? Grrrrr. And you thought that vacation spot was nice: the review you were reading was written two months before the 'flood' they experienced, and every review after that is lost somewhere in cyberspace.
Herein lies the issue: how do you index this kind of data in a way that can't be manipulated?
If you tell people which elements to include, people will lie.
If you don't, what becomes the standard? When Google crawls it? The most trusted sites keep pumping out the data that gets read first?
How does this work if some nobody site actually reveals something very important, but has very few inbound links? These are the challenges.
I think when it all boils down, it will continue to evolve, but the 'bones' will always stay the same. Links and link appraisals will rule, and they will always be tested. But as long as humans are involved in the process, manipulators will be part of the mix, and honestly that keeps it all healthy, because letting *goog* have its way exactly would not be a good thing at all, now would it? We'd be paying $50 a click one day to research how to clean dog pee out of our carpet.
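To make the links-versus-freshness tension concrete, here's a toy sketch, purely illustrative and assuming nothing about any real engine's formula: it blends a made-up link-authority score with an exponential freshness decay, so a stale but heavily linked review can still lose out to a fresh, lightly linked one. The half_life_days knob and the sample numbers are my own assumptions.

```python
def score(link_authority: float, age_days: float,
          half_life_days: float = 90.0) -> float:
    """Toy ranking score: link authority decayed exponentially by age.

    half_life_days is a hypothetical tuning knob; after that many days
    a page's effective score is halved. This is not any real engine's
    formula, just an illustration of the trade-off.
    """
    return link_authority * 0.5 ** (age_days / half_life_days)

# Hypothetical pages: the stale, heavily linked vacation review
# vs. a fresh post-'flood' report from a "nobody" site.
print(score(link_authority=1000.0, age_days=730))  # ~3.6
print(score(link_authority=40.0, age_days=7))      # ~37.9, the fresh page wins
```

Of course, any fixed half-life is exactly the kind of element manipulators would game, say with faked timestamps, which is the whole point above.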