Google has dominated the search engine industry for years, though it has frequently been criticized for the quality of its results in non-English languages. To address the problem, it has turned to latent semantic indexing and has become increasingly proficient at delivering multilingual search results. The range of content being searched keeps widening over time, pointing to an expanding macro environment. Search engine algorithms rely heavily on artificial intelligence; with only a limited set of pre-defined inputs they would be far simpler, but to understand the true intent of a query they must grasp the contextual meaning behind pairs of words, a capability attributable to deep learning.

Despite capturing roughly 70% of the global search engine market, Google still produces uneven results in some regions, partly because of regulatory policies. According to Shout Agency, however, the core problem is not the structure of the algorithms: Google can make educated assumptions when indexing any language, yet discrepancies in search results persist. The crux of the matter is that Google has had fewer opportunities to apply deep learning to some languages than to others. A smaller user base also brings risk, since there are fewer Google employees who understand the language well enough to judge the worth of its content, which lowers the chance that Google will apply manual penalties to poor content. This, in turn, lets spun content spread more widely and skews the algorithms that depend on deep learning.
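The "contextual meaning behind pairs of words" the article refers to is, in classical terms, what latent semantic indexing does: a term-document matrix is factorized so documents and queries can be compared in a low-dimensional concept space rather than by exact word overlap. The sketch below is a minimal illustration using scikit-learn; the corpus, query, and component count are invented for the example and are not a description of Google's actual implementation.

```python
# Minimal latent semantic indexing (LSI) sketch: TF-IDF term-document matrix,
# reduced by truncated SVD, with cosine similarity used to score a query.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Illustrative corpus: two documents about web search, two about deep learning.
docs = [
    "google indexes web pages for search",
    "search engines rank web pages by relevance",
    "deep learning models learn word context",
    "neural networks learn meaning from word pairs",
]

# Build the TF-IDF weighted term-document matrix.
vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(docs)

# Project documents into a 2-dimensional latent "concept" space.
svd = TruncatedSVD(n_components=2, random_state=0)
X_lsi = svd.fit_transform(X)

# Map a query into the same space and rank documents by cosine similarity;
# the search-related documents should score highest.
query_lsi = svd.transform(vectorizer.transform(["web search ranking"]))
print(cosine_similarity(query_lsi, X_lsi))
```

The article's point about data scarcity maps directly onto a sketch like this: the factorization is only as good as the volume and quality of text available in a given language, which is why languages with smaller corpora and fewer reviewers see weaker results.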

Read more at:  https://www.smartdatacollective.com/google-search-algorithms-use-big-data-multilingual-latent-semantic-indexing/