Alternatives for Free Cloud Storage in China
Posted on May 14th, 2012 by Lily Grozeva in Tech

China has a long-running practice of censoring and restricting access to foreign services, and cloud storage is no exception. Google Drive is the latest service to hit the Golden Shield. Google will find little solace in the fact that most of the popular cloud storage services are also restricted in China. Five hundred million active internet users are off-limits at a time when cloud storage is on the rise. There are already quite a few great service providers out there, and their sheer number suggests rising demand. Cloud storage for the masses is still a new and exciting thing.
DNS Tools
Posted on February 28th, 2012 by Victoria Pal in Tools

Sometimes the little things can get your online business in trouble. Each time someone visits your website, the request goes through a Domain Name Server, part of the Domain Name System (DNS). DNS makes it possible to assign meaningful domain names to groups of Internet resources and users, independent of each entity's physical location. Domain names are far easier to remember than raw IPv4 and especially IPv6 addresses.
We provide several free DNS Tools to help out with some important questions.
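To make the lookup step concrete, here is a minimal Python sketch (standard library only; example.com is just a placeholder domain, not one of our tools) showing the kind of answer a DNS query hands back:

    import socket

    def resolve(hostname):
        """Ask the system resolver for all address records of a host."""
        results = set()
        for family, _, _, _, sockaddr in socket.getaddrinfo(hostname, None):
            if family == socket.AF_INET:        # IPv4 (A record)
                results.add(("IPv4", sockaddr[0]))
            elif family == socket.AF_INET6:     # IPv6 (AAAA record)
                results.add(("IPv6", sockaddr[0]))
        return sorted(results)

    if __name__ == "__main__":
        for kind, address in resolve("example.com"):
            print(kind, address)

Every visit to a website begins with a resolution like this one, which is why a misconfigured DNS record is one of those "little things" that can take a business offline.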
From AC to DC
Posted on February 23rd, 2012 by Victoria Pal in Tech

“Sometimes you need to look back to see ahead.” I never thought this saying would apply to data centers. It seems the next big thing in data centers is the transition from AC to DC power architecture. DC took the back seat over a century ago and gave way to AC. So why should we revert to DC? We live in an AC world and things seem okay. Why the change?
How the Great Firewall of China Works
Posted on January 30th, 2012 by Victoria Pal

China, home to the largest population of web users in the world, has one of the most restricted internets, ensuring that netizens can neither post nor read information the government deems threatening.
What Does Robots.txt Do?
Posted on December 28th, 2011 by Victoria Pal in Tech

The robots.txt file contains instructions that tell search engine robots what to do with a particular website. Search engine robots follow the instructions in that file; spam bots, in most cases, simply ignore it.
A web robot is a program that reads the content of web pages. Before a robot crawls a website, it first checks the robots.txt file for instructions. The “Disallow” directive, for example, tells the robot not to visit a given set of pages on the site. Web administrators use this file to keep bots from indexing parts of a website for various reasons: they do not want the content to be accessible to other users, the website is under construction, or a certain part of the content must be hidden from the public.
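As a quick illustration, here is a hypothetical robots.txt (the /private/ path is made up for the example) checked with Python's standard urllib.robotparser, showing how a well-behaved crawler applies a Disallow rule:

    from urllib.robotparser import RobotFileParser

    # A hypothetical robots.txt: block everything under /private/
    # for all robots, but leave the rest of the site open.
    ROBOTS_TXT = """\
    User-agent: *
    Disallow: /private/
    """

    parser = RobotFileParser()
    parser.parse(ROBOTS_TXT.splitlines())

    # A compliant robot runs this check before fetching a page;
    # spam bots simply skip it.
    print(parser.can_fetch("*", "https://example.com/index.html"))      # True
    print(parser.can_fetch("*", "https://example.com/private/a.html"))  # False

Note that the check happens entirely on the crawler's side: robots.txt is advisory, not enforced, which is exactly why spam bots can afford to ignore it.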