Dedicated Server

A dedicated hosting service, dedicated server, or managed hosting service is a type of Internet hosting in which the client leases an entire server not shared with anyone. This is more flexible than shared hosting, as organizations have full control over the server(s), including choice of operating system.

Managed Dedicated Server

To date, no industry standards have been set to clearly define the management role of dedicated server providers. What this means is that each provider will use industry-standard terms, but each will define them differently.

SQL Server Compact Edition

The compact edition is an embedded database engine. Unlike the other editions of SQL Server, the SQL CE engine is based on SQL Mobile (initially designed for use with hand-held devices) and does not share the same binaries.

SQL Server Architecture

Code written for SQL CLR can access data stored in SQL Server databases using the ADO.NET APIs, just like any other managed application that accesses SQL Server data.

Bandwidth and Connectivity

Bandwidth refers to the data transfer rate or the amount of data that can be carried from one point to another in a given time period (usually a second) and is often represented in bits (of data) per second (bit/s).
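To make the arithmetic concrete, here is a minimal sketch in Python (the file size and link speed are made-up numbers, not measurements) showing how transfer time follows from bandwidth:

# Minimal sketch: estimating transfer time from a link's bandwidth.
# The file size and link speed are illustrative assumptions only.
file_size_mb = 100        # size of the file to transfer, in megabytes
link_speed_mbit = 10      # link bandwidth, in megabits per second

file_size_bits = file_size_mb * 8_000_000                # 1 MB = 8,000,000 bits (decimal units)
transfer_seconds = file_size_bits / (link_speed_mbit * 1_000_000)

print(f"{file_size_mb} MB over a {link_speed_mbit} Mbit/s link takes "
      f"about {transfer_seconds:.0f} seconds")            # roughly 80 seconds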

Wednesday, October 26, 2011

Oracle, Dell, EMC and VMware want it to be in the server


There is a battle going on behind the scenes over the location of storage's soul: the controller hardware and software. Oracle, Dell, EMC and VMware want it to be in the server, while NetApp and HDS want it to be in the array, an array operating with servers but distinct from them.

The picture is not as clear-cut as this on the surface – NetApp is working with Oracle for example – but this is my take on what is happening down in the development depths, among the strategists and engineers with multi-year product horizons.

The modern storage industry, the one shipping networked external storage arrays, has been built on two foundations. One is EMC's establishment of a market for third-party external, block-addressed storage arrays distinct from the server suppliers of the time: HP, IBM, Digital Equipment, etc.

The other was the invention and establishment of file-addressed network-attached storage (NAS or filers). NetApp is the single most effective proponent of that, although EMC grew to ship more filers than NetApp. EMC and NetApp represent the twin peaks of the external storage array.

A storage array comes in two flavours. It is either monolithic, with multiple controllers or engines and some fancy interconnect hardware to link these to the storage shelves – think Symmetrix, latterly VMAX – or modular. Modular arrays have two controllers linked – by simpler Fibre Channel or latterly SAS – to the storage shelves. NetApp's FAS arrays and EMC's CLARiiON are classic embodiments of this idea.

Applications in servers sent SCSI block requests or file access requests to these arrays, which presented themselves, logically, as a single pool of storage, separated into dedicated logical disks (LUNs) for the server apps, or sharable filestores.

Andy Bechtolsheim
This long-lived storage concept is now being discarded, and the first nail in its coffin came from Sun and the inventive Mr Andy Bechtolsheim.
Honeycomb upsets the storage hive

Bechtolsheim's idea was that co-locating servers and storage in the same overall enclosure would speed server apps dependent on lots of stored data. Thumper, a server-rich NAS device delivered as the X4500, was one result of this and Honeycomb another.

Neither set the world on fire but they did show the way to getting more data into servers faster. Then Oracle bought Sun in 2009 and suddenly Bechtolsheim's idea got a rocket boost from the Exadata product, a set of server resources running Oracle software with their own storage resources. This is setting the Oracle World on fire, with much encouragement from Oracle marketing because its own bunch of modular arrays was pretty second-rate.

Exadata Database Machine
What Sun invented and Oracle extended is the NoSAN server. EMC has seen this idea and responded by devising its opposite, the No-Server SAN, a kind of reverse engineering in its way.
EMC brings the servers to the array

EMC is trying to have it both ways. VMAX, VNX and Isilon arrays are going to be able to run application software in server engines in the array controller complex. There is a natural fit with VMware's ESX running the whole shebang and VMs being loaded to run storage controller software and applications that benefit from low-latency access to buckets of data. These array-located app servers use the array's own internal network or fabric, VMAX's Virtual Matrix for example, instead of the normal Ethernet or Fibre Channel fabric. This isn't SAN access as we know it.

EMC also has its Project Lightning, under which its arrays manage the loading and running of PCIe-connected flash caches in servers. Dell appears to be further along that road: the Round Rock company is also going to build servers with flash, but as a storage tier rather than a cache. This tier-zero storage is logically part of the entire array controller-managed storage pool, with automatic data movement.

Nvidia is looking to push future Tegra chips into servers



Looking beyond graphics processors, Nvidia wants to push future Tegra chips into servers as the chip maker tries to break Intel's dominance in that market.

Nvidia is developing its first CPU for PCs and servers, code-named Project Denver, which is based on the ARM architecture and also aimed at mobile devices. The Denver core will go into future Tegra chips, and special improvements will be made to server chips, said Steve Scott, chief technology officer of Nvidia's Tesla product line of enterprise graphics chips.

"There are some things we are doing that are particularly nice for our purposes. It will likely go into the Tesla line at some point," Scott said.

Nvidia's current presence in servers is mostly related to its Tesla graphics processors, which are being used in the world's fastest supercomputers to perform complex scientific and math calculations. The Oak Ridge National Laboratory is building a supercomputer called Titan that will include Nvidia's Tesla processors and Advanced Micro Devices' 16-core Opteron CPUs to deliver a peak performance of up to 20 petaflops. The fastest supercomputer is Japan's K, which delivers a performance of 8 petaflops.

Scott did not share specific details on how Nvidia would tweak future Tegra chips for servers. However, the company has said that Project Denver chips will harness the parallel processing capabilities of Nvidia GPUs with ARM CPUs, which could boost server performance.

Most servers today run on Intel's Xeon and AMD's Opteron chips, but there is growing interest in low-power ARM processors as companies look to cut electricity bills. Analysts have said that while ARM processors may lack the performance and reliability to overtake traditional server chips for critical tasks, a large collection of lightweight ARM cores could process high volumes of Web-based transactions while drawing less power.

Running complex calculations by harnessing the parallel processing capabilities of CPUs and GPUs can speed up servers while reducing overall power consumption and computing overhead, Scott said. Nvidia is already building graphics cores in current Tegra processors.

"The ARM instruction set is more power efficient than x86. That's why there are people looking to build ARM-based servers. That's why we like ARM in phones, because you get more performance per watt, more performance per square millimeter," Scott said.

It makes sense for Nvidia to push its Tegra chips into the server market, which has higher margins than mobile devices, said Dean McCarron, principal analyst at Mercury Research.

"They have some interesting parallel processing technology that works out for them, and they have ARM, which makes sense for them to pair to go after that class of applications," McCarron said.

Nvidia's target market for server chips could be GPU-dependent systems delivering graphics or mathematical rendering in the cloud, McCarron said. ARM processors are not as proficient as GPUs in performing complex calculations, so Nvidia could end up trading off some power efficiency in its ARM CPU design to bring in more performance.

"There are evolutionary pressures that drives you when you are going after servers compared to handhelds," McCarron said.

Companies like SeaMicro and Dell are building servers based on Intel's low-power Atom processors, but Nvidia's entry could fuel more interest in ARM servers. Nvidia's competitors will be Marvell, which last year announced a 1.6GHz quad-core ARM-based server chip, and Calxeda, which has built a server chip based on a quad-core ARM processor.

A big hurdle for ARM entering the server market is software compatibility, as most data-center code is written for x86 servers. A lot of IT implementations require corresponding server- and client-level compatibility, but x86 binary compatibility is less of a concern for Nvidia's future server chips delivering cloud services, Scott said.

"In the back room, in the cloud, binary compatibility doesn't matter nearly so much either," Scott said. "They are providing a service over the Web and they can switch to ARM, that is more power efficient."

The software stack is less of a worry on the server side than it is on the client side, where there could be issues around compatibility, McCarron said.

"As a user of a [cloud] service, the instruction set is meaningless. On the cloud side having to provide the service, that's where the investment comes in," McCarron said.

ASP.NET Assaults Strike Over 1m Websites

Google has disclosed a mass infection campaign that disrupted countless websites with malware targeting sites built on the ASP.NET and ASP Web-application frameworks, Softpedia reported on October 13, 2011.

More than 600,000 websites were affected when an SQL-injection attack targeted ASP.NET sites, and at the time Armorize publicized the finding, only 6 of 43 security products managed to detect the malicious program.
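To illustrate the general class of flaw being exploited, here is a minimal, generic sketch in Python using the built-in sqlite3 module; it is only an illustration of SQL injection, not the code of the affected ASP.NET sites. It contrasts a query built by string concatenation with a parameterized one:

import sqlite3

# Generic illustration of SQL injection, using Python's built-in sqlite3 module.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (id INTEGER PRIMARY KEY, body TEXT)")
conn.execute("INSERT INTO pages (body) VALUES ('Welcome')")

def find_page_unsafe(user_input):
    # Vulnerable: the input is spliced straight into the SQL text, so crafted
    # input can rewrite the query's logic instead of acting as a plain value.
    return conn.execute(
        "SELECT body FROM pages WHERE id = " + user_input).fetchall()

def find_page_safe(user_input):
    # Safer: the input is passed as a bound parameter and is treated purely
    # as data, so it cannot change the structure of the statement.
    return conn.execute(
        "SELECT body FROM pages WHERE id = ?", (user_input,)).fetchall()

print(find_page_unsafe("0 OR 1=1"))  # [('Welcome',)] -- the WHERE clause was rewritten
print(find_page_safe("0 OR 1=1"))    # [] -- treated as an ordinary, non-matching value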

The contamination involves injecting code into websites operated by hospitals, restaurants and other small businesses, planting an invisible web link that leads visitors' browsers to sites such as nbnjkl.com and jjghui.com.

Those sites in turn redirect to many others, such as www2.safetosecurity.rr.nu and www3.strongdefenseiz.in, which host concealed malware that abuses known security flaws in Java and in Adobe's Flash and PDF software.

Disturbingly, users running outdated browser components are infected the moment they visit any of the hijacked websites, with no indication of what caused the attack, even though the drive-by assault apparently hits only sites that rely on the aforementioned frameworks.

The domains used in the attack were registered under the false name "James Northone", the same bogus name used during the LizaMoon attacks of April 2011.

According to security firm Securi, the registration details of the URLs used in this attack exactly match those used for the earlier LizaMoon URLs. The LizaMoon attacks had a similar impact, compromising about 1.5 million websites, where malware diverted visitors to black-hat-SEO (BHSEO) poisoned sites that served malicious payloads.

Luckily, Australian websites mainly remained free from infection during both assaults.

Meanwhile, security specialists say that Internet users who already have protective measures installed should remain safe: once some anti-virus vendors become aware of a piece of malware, others quickly follow and blacklist it as malicious.

But end-users who have not yet installed such protection should promptly update their browsers, along with Java and Adobe Flash, since attackers often exploit flaws in older versions to deliver these kinds of browser-based drive-by threats.

Sunday, October 9, 2011

Five Features To Look For On A Cloud Hosting Control Panel

The ability to access information from anywhere and with different devices is a great benefit of cloud hosting. The control panel, however, needs to have a few basic features to get the most out of the service. In particular, five features should be on a cloud server hosting cpanel.

When working with cloud computing, an individual or a business needs to be able to access, maintain, and alter the files on the virtual system. The ability to track who has been on the service and which files have been changed helps administrators keep data intact and secure. Email services should be monitored and maintained from the virtual network. In addition, keeping the server secure needs to be manageable from the main control panel. Video and audio streaming is another great feature, as it enables users to host and watch videos or play games on the network.

Security is a very important feature on a virtual server. When data is kept on the Internet, it is vulnerable to viruses and hackers. The ability to block certain IP addresses, allow or block users, and keep password directories in a secure location helps reduce the threat of the wrong people accessing private information. No one likes to have a computer crash and take all of their information with it. A virtual system kept on the Internet provides users the ability to keep important files in another location. They can also start a document in one place and complete it somewhere else without having to worry about transferring it. Monitoring all of this data and making sure that all the files are accurate is a great feature on a control interface.

Video and audio streaming is another one of the five features that should be on a cloud server hosting cpanel. Sending grandma a video of a birthday party, creating a training video to help a friend learn new software, or keeping recorded notes are just some of the things people could do with this application. Moreover, with the data transfer speed cloud hosting is known for, the videos can be watched without waiting for them to load.

Logs help administrators and users keep track of who has visited the system recently and what they did while they were there. Logs also track the storage space and bandwidth being used, so it is possible to know whether the service package is sufficient and to review any errors that may have occurred in case data was lost. Email is a great feature to have on a web control system because it allows people to read their email no matter where they are; they can also block spam and change passwords to keep others out. This is a great feature for students, individuals, and employees.
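To make the log-tracking idea above concrete, here is a rough Python sketch of the kind of per-visitor summary such a panel might compute; the log format, field order, and sample entries are assumptions for illustration, not any particular panel's format:

from collections import Counter

# Assumed log format, one request per line: "<client-ip> <path> <bytes-sent>".
# The sample entries below are invented for illustration.
sample_log_lines = [
    "203.0.113.7 /index.html 5120",
    "203.0.113.7 /video.mp4 1048576",
    "198.51.100.2 /index.html 5120",
]

visits = Counter()        # requests per client IP
bytes_by_ip = Counter()   # bandwidth (bytes sent) per client IP

for line in sample_log_lines:
    ip, path, size = line.split()   # three whitespace-separated fields
    visits[ip] += 1
    bytes_by_ip[ip] += int(size)

for ip in visits:
    print(f"{ip}: {visits[ip]} requests, {bytes_by_ip[ip]} bytes sent")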

A cloud server panel needs to be able to provide flexibility and control for the user. Video streaming, mail, files, logs, and security are five features that help users. They keep data safe, accurate, and up to date and make the virtual network user friendly.
