SQL Server – SQL INSERT Stored Procedure


SQL Server Stored Procedure – INSERT – Example

Source code to create a SQL INSERT stored procedure and add it to the catalog

The following example creates a simple INSERT stored procedure. You can run it through an explicit call from a host-language program, or directly from a DBMS query execution shell such as SQL Server Management Studio or dbOrchestra.

You will note that, just as in an INSERT statement, you do not have to use all of the available columns when creating a stored procedure. You must, however, populate all columnar data associated with the PK (primary key), columns associated with unique indexes (note: there are exceptions to this, but they will not be addressed here), and columns defined in the DDL as NOT NULL.
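The article's original listing is not reproduced here; a minimal sketch of such a procedure, using a hypothetical student table whose column names are implied by the discussion below, might look like this:

```sql
-- Hypothetical example: a minimal INSERT stored procedure for a student table.
-- Table and column names are illustrative, not the article's actual schema.
CREATE PROCEDURE dbo.usp_insert_student
    @student_id  INT,           -- primary key: must be supplied
    @last_name   VARCHAR(50),   -- defined NOT NULL in the DDL
    @first_name  VARCHAR(50),   -- defined NOT NULL in the DDL
    @password    VARCHAR(50),   -- part of a unique index
    @birth_date  DATETIME
AS
BEGIN
    INSERT INTO dbo.student (student_id, last_name, first_name, password, birth_date)
    VALUES (@student_id, @last_name, @first_name, @password, @birth_date);
END
```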

Executing the SQL INSERT stored procedure

To run the insert stored procedure you need to supply a value for the student_id variable as well as populate all required data. In this example I have included a cross-section of columns for your reference, as well as the datatype associated with each column in our SQL INSERT.
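A sketch of such a call, continuing the hypothetical procedure and column names used above (the values are illustrative):

```sql
-- Hypothetical call; the values illustrate the datatypes discussed below.
EXEC dbo.usp_insert_student
    @student_id = 101,            -- integer data: no quotes
    @last_name  = 'Smith',        -- character data: single-quoted string
    @first_name = 'Ann',
    @password   = 'fidelio',      -- part of a unique index, so populated
    @birth_date = '1990-05-01';   -- datetime data: single-quoted string
```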

A few things to point out regarding the code above. Datetime and character data are entered as single-quoted strings; integer data is entered without quotes. I also included all columns that were defined as NOT NULL, as well as the password column, which is part of a unique index. You will note that I did not include columns for some of the non-unique indexes (soc_sec_num, other_id_num, and driver_license_num). I did this intentionally to demonstrate that they are not required columns. Having said this, in the real world one would only put indexes on columns where the intent is to actually collect data for those columns. I just wanted to make a technical point here; the pragmatic point is that you would normally expose the columns that are part of indexes.

What the stored row looks like after the SQL INSERT

I want to call your attention to the fact that all columns not referenced in the query are set to NULL. Also, if the schema had default values defined, those would be stored for their columns when the row is inserted.



Posted In: NEWS


Business Voice, VPS, Colocation – Internet Connectivity Service, Seattle, Bellevue


Home Personal Services

We offer a wide range of access options and services to residential customers and we have state of the art equipment and technology to provide fast and reliable connections for our customers. Our expert support staff is available to assist our residential customers with professional service to help with anything from getting connected to troubleshooting.

Residential Connectivity Services:

  • Web Page Hosting with several options available for your personal page.
  • DSL Internet Services featuring exceptionally fast speeds
  • Dial-Up Internet provides an affordable solution for residential internet connections
  • TrueRing Home Phone with low prices and no set up fees

Digital Home Phone

Unlimited Calling – only $24.99/month

  • Keep your current number
  • Use your existing phone
  • No computer needed

Personal Web Hosting

Less than $3 a Month!

  • Unlimited space, bandwidth, and databases
  • Unlimited e-mail
  • FREE domain name

If you are looking for VPS, business internet connectivity, or other services for your home or business in Bellevue, Redmond, or Seattle, you will find additional information about our residential and commercial services on our website.

© 1994–2017, ISOMEDIA Inc.

12842 Interurban Ave S, Seattle, Washington 98168



Mac Mail not connecting to Exchange Server – Part 1

Microsoft Outlook 2013/2016 clients on PC did not exhibit any issues. Latest Microsoft Outlook for Mac did not exhibit any issues either!

It is worth pointing out that while Outlook 2013/2016 on PC use MAPI/HTTP or RPC/HTTP to connect to Exchange 2013/2016 servers, both clients on Mac (Mac‘s own Mail client and Microsoft Outlook for Mac) use EWS to connect.

Mac Mail would show “Account Error” which read “Unable to connect … The server returned error: Connections to host mail.contoso.com on default ports failed.”

The error is somewhat misleading: when you look at the EWS logs on the Exchange server, you see that the server rejected the request with HTTP error 400, which translates to “bad request”. What I didn’t know at that point was which element in the HTTP request was causing the server to reject it.
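One way to see that raw 400 outside of any mail client is to probe the EWS endpoint by hand. A hedged sketch with curl (the hostname follows the article's placeholder domain, and the credentials are hypothetical):

```shell
# Probe the EWS endpoint directly; -v prints the HTTP status line the server returns.
# Hostname is the article's placeholder domain; user and password are hypothetical.
curl -v --ntlm -u 'CONTOSO\user:Passw0rd' \
    https://mail.contoso.com/EWS/Exchange.asmx
```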

Microsoft provides great testing tools for Exchange at www.exrca.com, so I decided to run the EWS connectivity test from the toolset. To my surprise, the test failed with the error “Message: The request failed. The remote server returned an error: (501) Not Implemented.”

I knew about this particular error, as I had worked on it with Michael Van Horenbeeck, and it is documented here:

After enabling L7 debug tracing on my KEMP LoadMaster I observed the same error mentioned in the above articles “lb100 kernel: L7: badrequest-client_read [] (-501): ?xml. 0 [hlen 1405, nhdrs 11]”

It would be easy to blame the load balancer at this point and implement the recommended fix by changing 100-continue handling on the LoadMaster. But if that were the issue, why did Outlook for Mac work without any errors?

Anyhow, I proceeded to change the 100-continue handling from the default setting of RFC-2616 compliant to RFC-7231 compliant instead:

After changing this setting, I ran the EWS test from the exrca website again, and, as one would expect, the test passed with no issues.

It did, however, fail to address the issue for the Mac Mail client! No eureka moment just yet!

I will cover the details of further troubleshooting in the second part of this article. You can read it here.


Index page, unreal media server




This iteration of UnrealSP.org has proven difficult to maintain. It will eventually be replaced by a new version, backed by a new CMS.

Until then, site contents are available on the Legacy site.

Additionally, newer content written by our staff is available in a temporary subforum.

We sincerely apologize for the delays in recreating the site.


  • Legacy website
  • House rules
  • Upcoming projects
  • Project reviews

IRC Chatroom

Visit our chat and have a real-time talk with the regulars! Become one yourself!


Welcome to UnrealSP.org


For discussion about UnrealSP.org itself.

Newer articles published by UnrealSP.org staff can be found here until the main site’s renewal comes to fruition.

For gameplay advice and broader discussion of single-player Unreal including custom maps, mods and mutations that alter the game.

For questions and discussion about UnrealEd, UnrealScript, and other aspects of Unreal Engine design.

Post your fan fiction and fan art threads here.

For discussion and promotion of Unreal Engine single-player campaigns, mapping projects and total conversions.

This forum is temporary and will be replaced by a new system.

For the discussion of projects currently in development. Read the instructions on how to make your project eligible.

For discussion of anything relating to multiplayer gameplay in the Unreal series, including co-op.

For random rambling. Please keep your posts tasteful and respectful.

Public forum for discussion of kea et al’s SP2D project.

For public discussion of all things EXU2.

Each week a single map is discussed here in detail.

Content from deprecated forums can be found here.

It may not be possible to post in some sub-forums.




What disk image should I use with VirtualBox: VDI, VMDK, or VHD?


The latest versions of VirtualBox support several formats for virtual disks, but they do not provide a comparison between them.

Now, I am interested in a recommendation or comparison that considers the following:

  • be able to use dynamic sizing
  • be able to have snapshots
  • be able to move my virtual machine to another OS or even another free virtualization solution with minimal effort (probably something that would run fine on Ubuntu).
  • performance

asked Nov 23 ’11 at 0:28

VirtualBox has full support for VDI, VMDK, and VHD, and supports Parallels Version 2 (HDD) (not newer versions).

Answering Your Considerations

  • be able to use dynamic sizing

VDI, VMDK, and VHD all support dynamically allocated sizing. VMDK has the additional capability of splitting the storage file into files of less than 2 GB each, which is useful if your file system has a small file-size limit.

  • be able to have snapshots

All four formats support snapshots on VirtualBox.

  • be able to move my virtual machine to another OS or even another free virtualization solution with minimal effort (probably something that would run fine on Ubuntu).

VDI is the native format of VirtualBox. I didn’t search for any other software that supports this format.

VMDK is developed by and for VMware, but Sun xVM, QEMU, VirtualBox, SUSE Studio, and .NET DiscUtils also support it. (This format might be the most apt for you, because you want virtualization software that runs fine on Ubuntu.)

VHD is the native format of Microsoft Virtual PC. This is a format that is popular with Microsoft products.

I don’t know anything about HDD. Judging from this site, Parallels is a Mac OS X product and probably isn’t suitable for you, especially considering that VirtualBox only supports an old version of the HDD format.

The format should not affect performance, or at least, performance impacts are negligible.

The factors that influence performance are:

  • your physical device limitations (much more noticeable on a hard disk drive than on a solid-state drive)
  • expanding a dynamically allocated virtual disk drive (write operations are slower as the virtual disk expands, but once it’s large enough, expanding should happen less)
  • virtualization technology (hardware vs. software; hardware virtualization helps VirtualBox and improves the speed of virtual operating systems)
  • the fact that you are running a virtual operating system. Performance is always slower than running an operating system on the host because of the virtualization process.

answered Jun 22 ’12 at 20:33

I always use VDI, as it is the native format of VirtualBox; however, using a VMDK (VMWare format) will increase compatibility with other virtual machine software.

VirtualBox will run fine on Ubuntu, so if the goal is Windows/Ubuntu interoperability, VDI would be a perfectly valid choice.

Both formats will fulfill your requirements.

As for the other two, VHD is a Microsoft-developed format and HDD is a Parallels-developed format; both are proprietarily licensed, which limits cross-platform support, so I wouldn’t recommend them.
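If cross-platform support later becomes a concern, VirtualBox can convert between these formats itself. A sketch using the VBoxManage CLI (file names are hypothetical; on releases older than 5.0 the sub-command was clonehd):

```shell
# Convert a VDI image to VMDK without touching the original (file names hypothetical).
VBoxManage clonemedium disk ubuntu.vdi ubuntu.vmdk --format VMDK
```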

Mpack explains a key performance difference between VHD and VDI here:

Having recently studied the VHD format, I would expect there to be at least a small difference in VDI’s favor, most noticeable when you are comparing like with like, i.e. an optimized VDI vs an optimized VHD. The reason is that the dynamic VHD format has these “bitmap” sectors scattered throughout the disk. Every time you modify a sector inside a block, these bitmap blocks may need to be updated and written to as well, involving extra seeks, reads and writes. These bitmap sectors also have to be skipped over when reading consecutive clusters from a drive image – more seeks. The VDI format doesn’t have these overheads, especially if the VDI has been optimized (blocks on the virtual disk sorted into LBA order).

All of my comments apply to the dynamic VHD format vs dynamic VDI. Performance tests on fixed-size virtual disks are pointless, since both formats are then the same (just a plain image of a disk); they just have different headers.

answered May 8 ’14 at 14:20

I don’t know if using vmdk would enable you to transparently run a virtual machine created in VirtualBox in VMware or not. It might. However a more universal option might be to use the VirtualBox File/Export function to create an “Open Virtualization Appliance” .ova file that can then be imported into VMware. With that approach, you can port to any virtualization system that supports .ova without caring what disk image format you use in VirtualBox.
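That export can also be scripted from the command line rather than the File menu; a sketch (the VM name and file names are hypothetical; ovftool is VMware's appliance import/convert utility):

```shell
# Export the VM to an OVA appliance, then convert it for VMware (names hypothetical).
VBoxManage export "MyVM" --output MyVM.ova
ovftool MyVM.ova MyVM-vmware.vmx
```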

If you need to export from the same VM at regular intervals, e.g. every day, that could be a pain. But if you only move to a different technology occasionally, it should be fine.

If you have a .vdi file already, you can test whether this works without having to create a new virtual machine: export it to an .ova, then try importing it into VMware.

answered Jul 3 ’12 at 21:22

A good reason for me to use VMDK is that VirtualBox (at least until v4.1) using the VDI format has a tendency, over time, to fill the complete allocated disk space, even though the internal virtual disk usage is still much less. With VirtualBox using VMDK disks, this seems less of a problem.

But I’m talking years uptime. This might not be a problem many people encounter.

answered Jan 30 ’15 at 15:13

It’s more related to the fragmentation of the guest file system than to the format itself. – Enzo Jun 3 ’16 at 15:19

It depends on how you plan to use the virtual disk as well. Not every VM wants a single partition on a single disk.

VDI seems to have more options (when used with VirtualBox), but as soon as you take VirtualBox out of the picture, support for VDI becomes somewhat shaky (as of late 2014).

For instance, my solutions need maximum cross-platform support. Mounting a VDI (e.g. as a loopback device) on Linux or Windows 7 is harder and buggier than you might expect. It is almost as if the VDI format has too many features, making it difficult to write fully conforming utilities that can operate on it.

VMDK is just less painful, IMHO, when you want it to work with any VM on any workstation, when you want to clone it three times to other systems on the network at the same time, and when you want to pry it open without launching a VM instance.

Even though I use VirtualBox 90% of the time, those few times when my disks became inaccessible in certain workflows have led me to favor VMDK for pluggable/shared filesystems.

answered Jan 8 ’15 at 4:33

Disk image files reside on the host system and are seen by the guest systems as hard disks of a certain geometry. When a guest operating system reads from or writes to a hard disk, VirtualBox redirects the request to the image file.

Like a physical disk, a virtual disk has a size (capacity), which must be specified when the image file is created. As opposed to a physical disk, however, VirtualBox allows you to expand an image file after creation, even if it already contains data. VirtualBox supports four variants of disk image files:

VDI: Normally, VirtualBox uses its own container format for guest hard disks — Virtual Disk Image (VDI) files. In particular, this format will be used when you create a new virtual machine with a new disk.

VMDK: VirtualBox also fully supports the popular and open VMDK container format that is used by many other virtualization products, in particular by VMware.[25]

VHD: VirtualBox also fully supports the VHD format used by Microsoft.

Image files of Parallels version 2 (HDD format) are also supported.[26] For lack of documentation of the format, newer versions (3 and 4) are not supported. You can, however, convert such image files to version 2 format using tools provided by Parallels.

answered Nov 28 ’15 at 18:23

It looks like using VDI makes it possible to trim the disk file down to its actual size; see “VirtualBox and SSDs – TRIM command support”.
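The compaction step that answer alludes to can be sketched as follows (the file name is hypothetical, and free space inside the guest must be zeroed first, e.g. with zerofree on Linux, for --compact to reclaim it):

```shell
# Shrink a dynamically allocated VDI back to its actual used size (name hypothetical).
VBoxManage modifymedium disk mydisk.vdi --compact
```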

answered Nov 19 ’16 at 0:23

While accurate, it’s a bit lackluster for a question that asks about the general differences between those formats, don’t you think? – Seth Nov 21 ’16 at 11:02



Using MERGE in SQL Server to insert, update and delete at the same time

By: Arshad Ali | Related Tips: More > T-SQL

In a typical data warehousing application, quite often during the ETL cycle you need to perform INSERT, UPDATE and DELETE operations on a TARGET table by matching the records from the SOURCE table. For example, a products dimension table has information about the products, and you need to sync up this table with the latest product information from the source table. You would need to write separate INSERT, UPDATE and DELETE statements to refresh the target table with an updated product list, or do lookups. Though it seems straightforward at first glance, it becomes cumbersome when you have to do it very often or on multiple tables, and performance degrades significantly with this approach. In this tip we will walk through how to use the MERGE statement and do this in one pass.


Beginning with SQL Server 2008, you can use the MERGE SQL command to perform these operations in a single statement. This new command is similar to Oracle’s UPSERT (a fusion of the words UPDATE and INSERT) command; it inserts rows that don’t exist and updates the rows that do exist. With the introduction of the MERGE SQL command, developers can more effectively handle common data warehousing scenarios, like checking whether a row exists and then executing an insert, update or delete.

The MERGE statement basically merges data from a source result set into a target table based on a condition that you specify, and on whether the data from the source already exists in the target or not. The new SQL command combines the sequence of conditional INSERT, UPDATE and DELETE commands in a single atomic statement, depending on the existence of a record. The new MERGE SQL command looks like this:
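The article's original listing is not reproduced here; the general shape of the statement, with illustrative placeholder clause bodies, is:

```sql
-- General shape of MERGE (sketch; <...> placeholders are illustrative).
MERGE <target_table> AS TARGET
USING <source_table> AS SOURCE
    ON (TARGET.<key_column> = SOURCE.<key_column>)
WHEN MATCHED THEN
    UPDATE SET TARGET.<column> = SOURCE.<column>
WHEN NOT MATCHED BY TARGET THEN
    INSERT (<column_list>) VALUES (SOURCE.<column_list>)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;
```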

The MERGE statement basically works as separate insert, update, and delete statements all within the same statement. You specify a “Source” record set and a “Target” table, and the join between the two. You then specify the type of data modification that is to occur when the records between the two data are matched or are not matched. MERGE is very useful, especially when it comes to loading data warehouse tables, which can be very large and require specific actions to be taken when rows are or are not present.

Putting it all together

In this example I will use a Products table as the target table and UpdatedProducts as a source table containing the updated list of products. I will then use the MERGE SQL command to synchronize the target table with the source table.

First, let’s create a target table and a source table and populate some data into these tables.
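A sketch of that setup and the MERGE itself, using the Products/UpdatedProducts tables named above (the column names and sample rows are illustrative, not the article's original listing):

```sql
-- Illustrative schema and data for the Products / UpdatedProducts example.
CREATE TABLE dbo.Products (
    ProductID   INT PRIMARY KEY,
    ProductName VARCHAR(100),
    Rate        MONEY
);
CREATE TABLE dbo.UpdatedProducts (
    ProductID   INT PRIMARY KEY,
    ProductName VARCHAR(100),
    Rate        MONEY
);

INSERT INTO dbo.Products VALUES (1, 'Tea', 10.00), (2, 'Coffee', 20.00), (3, 'Muffin', 30.00);
INSERT INTO dbo.UpdatedProducts VALUES (1, 'Tea', 10.00), (2, 'Coffee', 25.00), (4, 'Biscuit', 40.00);

-- Synchronize the target with the source in one atomic statement:
-- update changed rows, insert new rows, delete rows absent from the source.
MERGE dbo.Products AS TARGET
USING dbo.UpdatedProducts AS SOURCE
    ON (TARGET.ProductID = SOURCE.ProductID)
WHEN MATCHED AND (TARGET.ProductName <> SOURCE.ProductName
               OR TARGET.Rate <> SOURCE.Rate) THEN
    UPDATE SET TARGET.ProductName = SOURCE.ProductName,
               TARGET.Rate = SOURCE.Rate
WHEN NOT MATCHED BY TARGET THEN
    INSERT (ProductID, ProductName, Rate)
    VALUES (SOURCE.ProductID, SOURCE.ProductName, SOURCE.Rate)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;
```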

MERGE SQL statement – Part 1

Hi, thank you so much, this is a very nice article. I was facing a problem of how to tell which rows were updated in the final target table, and I got an idea by reading your article: I declared a Modified_date column in the target table, and in the matched condition I check whether any row values were updated; at that time Modified_date changes, which makes the updated rows easy to identify in the final result.

Friday, February 20, 2015 – 6:54:03 AM – Shmuel Milavski

This was very helpful to me.

Monday, February 09, 2015 – 11:24:58 AM – Paul

Is there a way to show only the items that are not matched in the target table from the source table? I run the script below in SQL 2008 and get the following output, which is great, but I want to be able to see which records did not match from the source table so I don’t have to go through 120 lines of data manually. The only table and column I am interested in is stgitem.avgunitcost, which is in the source table.

use test
truncate table dbo.stgitem

bulk insert dbo.stgitem
from 'C:\PriceUpdate\priceupdate.csv'

use test
merge dbo.item
using dbo.stgitem
on item.code = stgitem.code
when matched then
update set item.avgunitcost = stgitem.avgunitcost, item.issuecost = stgitem.issuecost;

(120 row(s) affected)

(1 row(s) affected)

(1 row(s) affected)

(119 row(s) affected)

Friday, November 21, 2014 – 1:58:45 PM – Kimberly Ford

This was a fabulous article. I needed to share a few tables from my data warehouse with another team but didn’t want to give them access to the entire database. So, I create another database with just the tables I needed, used PART 2 and I have a fantastic method of keeping the tables in the “off database” updated! Now to just create my stored procedure and I’m set. Definitely a huge help.

Friday, November 07, 2014 – 4:05:03 PM – ola

Nice article, Arshad.

I am currently employing the MERGE function in my code, and for some reason the matched rows are not updated in the target table when the update runs; only the not-matched rows get inserted into the target table.

Any suggestions, please?

Monday, November 03, 2014 – 12:56:05 PM – Tejas

Nice article – very good example.

In your example above, when a row does not exist in the source (WHEN NOT MATCHED BY SOURCE), you delete the row from the target table.

In my case, I just want to update a flag on the target row to mark it as deleted (in the source).

SQL 2008 R2 won’t allow this:




Connecting to the Amazon SES SMTP Endpoint

The following table shows the Amazon SES SMTP endpoints for the regions in which Amazon SES is available.

US East (N. Virginia): email-smtp.us-east-1.amazonaws.com

The Amazon SES SMTP endpoint requires that all connections be encrypted using Transport Layer Security (TLS). (Note that TLS is often referred to by the name of its predecessor protocol, SSL.) Amazon SES supports two mechanisms for establishing a TLS-encrypted connection: STARTTLS and TLS Wrapper. Check the documentation for your software to determine whether it supports STARTTLS, TLS Wrapper, or both.

If your software does not support STARTTLS or TLS Wrapper, you can use the open source stunnel program to set up an encrypted connection (called a “secure tunnel”), then use the secure tunnel to connect to the Amazon SES SMTP endpoint.

Amazon Elastic Compute Cloud (Amazon EC2) throttles email traffic over port 25 by default. To avoid timeouts when sending email through the SMTP endpoint from EC2, use a different port (587 or 2587) or fill out a Request to Remove Email Sending Limitations to remove the throttle.


STARTTLS is a means of upgrading an unencrypted connection to an encrypted connection. There are versions of STARTTLS for a variety of protocols; the SMTP version is defined in RFC 3207.

To set up a STARTTLS connection, the SMTP client connects to the Amazon SES SMTP endpoint on port 25, 587, or 2587, issues an EHLO command, and waits for the server to announce that it supports the STARTTLS SMTP extension. The client then issues the STARTTLS command, initiating TLS negotiation. When negotiation is complete, the client issues an EHLO command over the new encrypted connection, and the SMTP session proceeds normally.
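This handshake can be exercised by hand with OpenSSL's s_client; a sketch (the endpoint shown is the US East one, so substitute your region's):

```shell
# Connect on 587 and upgrade to TLS via STARTTLS, then speak SMTP over the tunnel.
openssl s_client -crlf -starttls smtp \
    -connect email-smtp.us-east-1.amazonaws.com:587
```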

TLS Wrapper

TLS Wrapper (also known as SMTPS or the Handshake Protocol) is a means of initiating an encrypted connection without first establishing an unencrypted connection. With TLS Wrapper, the Amazon SES SMTP endpoint does not perform TLS negotiation: it is the client’s responsibility to connect to the endpoint using TLS, and to continue using TLS for the entire conversation. TLS Wrapper is an older protocol, but many clients still support it.

To set up a TLS Wrapper connection, the SMTP client connects to the Amazon SES SMTP endpoint on port 465 or 2465. The server presents its certificate, the client issues an EHLO command, and the SMTP session proceeds normally.
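The equivalent manual check for TLS Wrapper omits the STARTTLS upgrade and goes straight to the TLS port (endpoint again assumed to be US East):

```shell
# TLS from the first byte on port 465; no STARTTLS negotiation.
openssl s_client -crlf -connect email-smtp.us-east-1.amazonaws.com:465
```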

Secure Tunnel

If your software does not support STARTTLS or TLS Wrapper, you can set up a secure tunnel to allow your software to communicate with the Amazon SES SMTP endpoint. As this option is most commonly used by mail server administrators, details are given under Integrating Amazon SES with Your Existing Email Server.
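For completeness, a minimal stunnel client configuration for such a secure tunnel might look like this (the endpoint hostname is the US East one, and the local port 2525 is an arbitrary choice):

```
; stunnel client mode: accept plain SMTP locally, forward over TLS to SES.
client = yes

[ses-smtp]
accept  = 127.0.0.1:2525
connect = email-smtp.us-east-1.amazonaws.com:465
```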



Linfo – Shows Linux Server Health Status in Real-Time

Linfo is a free and open source, cross-platform server statistics UI/library which displays a great deal of system information. It is an extensible, easy-to-use PHP5 library (installable via Composer) for getting extensive system statistics programmatically from your PHP application. It offers an Ncurses CLI view as well as a Web UI, and works on Linux, Windows, *BSD, Darwin/Mac OS X, Solaris, and Minix.

It displays system info including CPU type/speed and architecture, mount point usage, hard/optical/flash drives, hardware devices, network devices and stats, uptime/date booted, hostname, memory usage (RAM and swap, if possible), temperatures/voltages/fan speeds, and RAID arrays.


  • PHP 5.3
  • pcre extension
  • Linux /proc and /sys mounted and readable by PHP; tested with the 2.6.x/3.x kernels

How to Install Linfo Server Stats UI/library in Linux

First, create a Linfo directory in your Apache or Nginx web root directory, then clone and move repository files into /var/www/html/linfo using the rsync command as shown below:
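The commands referred to above might look like this (the web-root path assumes Apache on a typical distro; the repository URL is taken from the article's own link):

```shell
# Clone Linfo and sync it into the web root (web-root path is an assumption).
git clone https://github.com/jrgp/linfo.git
sudo mkdir -p /var/www/html/linfo
sudo rsync -av linfo/ /var/www/html/linfo/
```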

Then rename sample.config.inc.php to config.inc.php. This is the Linfo config file; you can define your own values in it:
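A sketch of that rename, using the web-root path assumed above:

```shell
# Create the active config from the shipped sample.
cd /var/www/html/linfo
sudo mv sample.config.inc.php config.inc.php
```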

Now open the URL http://SERVER_IP/linfo in a web browser to see the Web UI, as shown in the screenshots below.

This screenshot shows the Linfo Web UI displaying core system info, hardware components, RAM stats, network devices, drives and file system mount points.

Linux Server Health Information

You can add the line below in the config file config.inc.php to yield useful error messages for troubleshooting purposes:
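The line in question is presumably Linfo's error-reporting switch; in the sample config it looks like this (the key name is taken from Linfo's sample.config.inc.php and may vary by version):

```php
// In config.inc.php: enable verbose error output for troubleshooting.
$settings['show_errors'] = true;
```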

Running Linfo in Ncurses Mode

Linfo has a simple ncurses-based interface, which relies on PHP’s ncurses extension.

Now compile the php extension as follows

Next, if you successfully compiled and installed the php extension, run the commands below.

Verify the ncurses extension.
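The compile/enable/verify steps above can be sketched as follows (the PECL package name is ncurses; the ini path and php5enmod helper are Debian/Ubuntu conventions and may differ on your distro):

```shell
# Build the PHP5 ncurses extension, enable it, and confirm it is loaded.
sudo pecl install ncurses
echo "extension=ncurses.so" | sudo tee /etc/php5/mods-available/ncurses.ini
sudo php5enmod ncurses
php -m | grep ncurses
```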

Linux Server Information

The following features are yet to be added to Linfo:

  1. Support for more Unix operating systems (such as Hurd, IRIX, AIX, HP-UX, etc.)
  2. Support for less known operating systems: Haiku/BeOS
  3. Extra superfluous features/extensions
  4. Support for htop-like features in ncurses mode

For more information, visit Linfo Github repository: https://github.com/jrgp/linfo

That’s all! From now on, you can view a Linux system’s information from within a web browser using Linfo. Try it out and share with us your thoughts in the comments. Additionally, have you come across any similar useful tools/libraries? If yes, then give us some info about them as well.



D-Link Products – DNS-320-110 – 1TB ShareCenter – 2-Bay Network Storage


1TB ShareCenter 2-Bay Network Storage, SATA, RAID 0/1, JBOD, USB Print Server

This ShareCenter storage device comes with a 1TB hard drive so everyone on your network can back up and share their documents, music, photos, videos to a central location and access them remotely over the Internet right out of the box. Plus, it features a built-in Web File and FTP server so you can remotely access your files over the Internet.


  • The cost-effective way to store and share your documents, music, videos, and photos with anyone on your network
  • Ideal backup solution for households with more than 1 computer – no need to physically connect a USB drive to each computer to perform scheduled backups.
  • Included backup software allows users to protect important files by scheduling automatic backups on a set timeline.
  • Share a USB printer over the network between all the computers in your house
  • Built-in Web File server and FTP server to access digital files remotely over the Internet
  • 2-bay device with one 1TB SATA disk drive pre-installed and pre-formatted for plug-’n’-go installation – for additional capacity, insert a second 3.5" SATA disk drive without any tools or attaching any cables
  • Connect to the network with a Gigabit port for fast transfer speeds
  • Support for RAID 1 (mirroring) to protect against data loss in the event of a disk drive failure
  • Easy management with user-friendly web-based GUI.
  • Complementary solution with other D-Link digital home products:
    • store and stream media content from your ShareCenter to your TV using your Boxee Box
    • store IP video to your ShareCenter as part of your home monitoring solution

Raid 1 Technology

RAID 1 is like a backup for your backup. It causes the drives in your storage device to mirror each other, so if one drive fails the unaffected drive will continue to function until the failed one is replaced.

Web File Server and FTP Server

The DNS-320 includes a built-in Web File server and FTP server to make accessing files remotely, over the Internet, a breeze. The FTP server will come in handy when you need to share a file too big to e-mail to a friend.

This ShareCenter device has a built-in media server that can stream your photos, music and videos to all of the computers in your home and to compatible media players like Boxee Box, Xbox and PlayStation 3 so you can enjoy it all on your TV.

10/100/1000 Gigabit Port

The 10/100/1000 Gigabit Ethernet port gives you blazing fast speeds so you can back up and access your data without the wait.

There’s no overheating in this device’s future. It comes with a quiet, built-in cooling fan that only comes on when needed, saving you power and money.

D-Link Easy Search Utility

The D-Link Easy Search Utility allows you to locate your ShareCenter from anywhere on the network and map your network drive so it will conveniently appear in My Computer on your PC.

The included feature-rich backup software allows you to create schedules and rules including full or incremental backups.

Not quite the handyman? That’s OK. You can insert an additional hard drive without using any tools or attaching any cables.

This product allows hard drives to enter sleep mode when not in use to conserve electricity and prolong the life of the hard disk. Plus, it uses recyclable packaging and complies with the European Union’s RoHS directive.

USB Print Server Port

This storage device can also serve as a USB print server, allowing you to share a single USB printer over your network. The USB port can also support an uninterruptible power supply (UPS) monitor that supports smart signaling over a USB connection. If a power outage were to occur, the compatible UPS monitor would allow the UPS unit to safely shut down the ShareCenter.

AjaXplorer is a file explorer that allows you to remotely manage your files using a web browser. Stay protected – anywhere, anytime.

Not to worry, Mac users! ShareCenter can back up your data regardless of your computer’s operating system (Windows, Mac or Linux)

  • CPU
  • 800 MHz
  • 128MB RAM
  • Standards
  • IEEE 802.3 10Base-T Ethernet
  • IEEE 802.3u 100Base-TX Fast Ethernet
  • IEEE 802.3ab 1000Base-T Gigabit Ethernet
  • Support Hard Drive Type
  • 3.5" Internal SATA Hard Drive
  • Ports
  • 1 x 10/100/1000 Gigabit Ethernet
  • 1 USB Port
  • Power
  • Drive Management
  • Four Different Hard Drive Configurations (RAID 0, 1, JBOD, Standard)
  • Drive Status with E-mail Alerts
  • Drive Quotas
  • Power Management
  • Device Management
  • Internet Explorer v6 or other Java-enabled Browsers
  • LEDs
  • Power
  • LAN
  • HDD 1
  • HDD 2
  • Certifications
  • FCC Class B
  • CE
  • Power Supply
  • External Power Supply
  • DC 12V / 4A Switching
  • Power Consumption
  • Normal Mode: 15.7 W
  • Sleep Mode: 8.2 W
  • Power Management
  • Power Saving
  • Schedule Power Off
  • Auto Power Recovery Support
  • Operating Temperature
  • 32 to 104 °F (0 to 40 °C)
  • Operating Humidity
  • 10% to 95% Non-Condensing
  • Dimensions (W x D x H)
  • Item: 4.5 x 6.9 x 5.8 inches (115mm x 175mm x 149mm)
  • Packaging: 8.14 x 11.18 x 7.48 inches (206.76mm x 283.97mm x 189.99mm)
  • Weight
  • 3.37lbs
  • Warranty
  • 3 Year Limited*
  • Minimum System Requirements
  • Computer with:
  • 1GHz Processor
    512MB RAM
    200MB of Available Hard Disk Space
    Windows 7, Windows Vista or Windows XP SP2**
    CD-ROM Drive to View Product Documentation and Install Software
  • Package Contents
  • ShareCenter 2-Bay Network Storage with USB Printer Port
  • 1 x 1TB SATA hard drive
  • Power Adapter
  • Ethernet Cable
  • Quick Installation Guide
  • CD ROM with:
  • Installation Wizard
    Product Documentation

Hard drive(s) not included with the DNS-320 model. The DNS-320-110 includes 1 x 1TB SATA drive. An internal 3.5" SATA drive is required to store or share files and must be formatted before use. RAID 0 and RAID 1 require the use of two (2) SATA drives. To avoid drive incompatibility in RAID 0 or RAID 1 operation, use SATA drives from the same manufacturer. Formatted drive capacity for RAID 1 operation depends on the capacity of the smaller drive. May not work with older-generation SATA drives. For a list of SATA drives that have been tested to work with the DNS-320, visit the FAQs section.
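The capacity rules above (RAID 1 limited by the smaller drive, RAID 0 striping across both drives, JBOD concatenating them) can be sketched as a small calculation. This is an illustrative helper only, not D-Link software, and it ignores filesystem overhead:

```python
def usable_capacity(drive_a_gb, drive_b_gb, mode):
    """Approximate usable capacity (GB) for a 2-bay NAS, ignoring formatting overhead."""
    smaller = min(drive_a_gb, drive_b_gb)
    if mode == "RAID 0":                  # striped across both drives
        return 2 * smaller
    if mode == "RAID 1":                  # mirrored: limited by the smaller drive
        return smaller
    if mode in ("JBOD", "Standard"):      # drives concatenated or used individually
        return drive_a_gb + drive_b_gb
    raise ValueError(f"unknown mode: {mode}")

# Example: with a 1000 GB and a 2000 GB drive, RAID 1 yields only 1000 GB
# of protected storage, which is why matched drives are recommended.
```

This also shows why mixing drive sizes wastes space in RAID 0 and RAID 1: the larger drive's extra capacity goes unused.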

Use of an FTP Server over the Internet does not provide secure or encrypted transmissions, unless the end user enables SSL authentication in their FTP client.

D-Link cannot guarantee full compatibility or proper playback with all codecs. Playback capability depends on the codec support of the UPnP AV media player.

24/7 Basic Installation Support is available only in the USA for the first 30 days from date of original purchase.

D-Link makes no warranty as to the availability, reliability, functionality, and operation of the iTunes Server feature.

*3-Year Limited Warranty available only in the USA and Canada.

**Computer must adhere to Microsoft’s recommended System Requirements.

All references to speed and range are for comparison purposes only. Product specifications, size, and shape are subject to change without notice, and actual product appearance may differ from that depicted on the package. See inside package for warranty details.


Posted In: NEWS


Monitoring Exchange 2013 with SCOM 2012 (Part 2) #exchange #server #monitoring


Monitoring Exchange 2013 with SCOM 2012 (Part 2)

Table 1: List of servers

Exchange 2013 Management Pack Pre-requisites and Considerations

Before importing the Exchange 2013 MP into System Center Operations Manager, there are some pre-requisites that have to be met:

You have one of the following versions of System Center Operations Manager deployed in your organization:

  • System Center Operations Manager 2012 RTM or later
  • System Center Operations Manager 2007 R2 or later
  • You have already deployed SCOM agents to your Exchange Servers.

    The SCOM agents on your Exchange Servers are running under the local system account.

    The SCOM agents on your Exchange Servers are configured to act as a proxy and discover managed objects on other computers.

    If you are monitoring Exchange Server 2013 Database Availability Groups (DAGs), ensure that all DAG members are monitored by Operations Manager.

    Before downloading and installing the Exchange Server 2013 MP, you might want to import some recommended additional management packs, such as (these are the ones I used):

    Add the Exchange servers as agent managed computers

    Adding the Exchange servers to monitor as agent managed computers is the first required step.

    1. Click the Administration tab and then click Configure computers and devices to manage on the Actions pane. This will start the Computer and Device Management Wizard (Figure 2). Click Next, choose Advanced Discovery (Figure 3) and select Servers Only from the Computers and Device Classes drop-down box.
    2. On the next window, browse for the computers you are adding (Figure 4) and click Next. Select Use selected Management Server Action Account (Figure 5), click Discover and wait for the discovery results (Figure 6). Figure 7 shows a brief summary that is displayed at the end of the wizard. It is mandatory that all systems running Exchange Server 2013 that are managed by Operations Manager use Local System as the Agent Action Account. Click Finish.
    3. If the agent installation was successful, on each Exchange server you'll be able to see the System Center 2012 Operations Manager Agent listed under Programs and Features on Windows 2012 (Figure 8). A new service is also created, the System Center Management Service, as depicted in Figure 9.
    4. To enable Agent Proxy configuration on all managed Exchange servers, in the Administration pane, under Administration > Device Management > Agent Managed, right-click on each Exchange server (Figure 10), select Properties, then the Security tab (Figure 11), and check the box Allow this agent to act as a proxy and discover managed objects on other computers. This step will also make Exchange cluster instances appear in the Agentless Managed section (ensure that all physical nodes of the cluster are monitored). Repeat the process for every managed Exchange 2013 server in the list.

    Create a new management pack for customizations

    The customizations and overrides of sealed management packs, such as the Exchange 2013 MP, are usually saved in the default management pack. As a best practice you should create and use an unsealed management pack for each sealed management pack that you want to override, as shown in Figure 12.

    Figure 12: Unsealed management packs

    Creating a new management pack for storing overrides has the following advantages:

    • It simplifies the process of exporting customizations that were created in your test and pre-production environments to your production environment.
    • It allows you to delete the original management pack without first needing to delete the default management pack.
    • It is easier to track and update customizations to individual management packs.
    1. In the Operations Console, click the Administration button. In the Administration pane, right-click Management Packs and then click Create Management Pack. The Create a Management Pack wizard displays.
    2. On the General Properties page (Figure 13), type a name for the management pack in Name, the correct version number in Version, and a short description in Description. Click Next and then Create.

    Install the Exchange Server 2013 MP

    With the recommended additional management packs already imported, download and install the latest Microsoft Exchange Server 2013 Management Pack (at the time of writing, version 15.00.0620.018). You can find the latest Management Packs at the Microsoft System Center Marketplace.

    Let's look at the installation steps of the Exchange 2013 Management Pack:

    1. Download the management pack file and launch the Microsoft Installer (MSI) package on the selected SCOM server. Accept the license agreement, and click Next (Figure 14).
    2. Accept the default installation folder or select a new one. Click Next (Figure 15). The extraction process begins.
    3. When the extraction ends, click Close (Figure 16). When the installation is complete, the management pack files are copied to the System Center Management Packs folder.
    4. To import the Exchange 2013 MP, open the SCOM 2012 Operations Console. Click the Administration tab, right-click the Management Packs node and then click Import Management Packs (Figure 17).
    5. Click Add, then Add from disk, and click No on the Online Catalog Connection window. Select all the files from the Exchange MP directory, by default C:\Program Files\System Center Management Packs (Figure 18), click Open and then click the Install button (Figure 19).
    6. The Import Management Packs page appears and shows the progress for each management pack. After the import process is complete and the dialog box displays an icon next to each Management Pack indicating a successful import (Figure 20), click the Close button. Click View and then Refresh, or press F5, to see the Microsoft Exchange Server 2013 management pack in the list of Management Packs.

    After importing the Exchange 2013 MP, it will immediately start discovering Exchange machines. So, if you browse to the Discovered Inventory pane on the Operations Console (Figure 21), all the Exchange 2013 servers should be listed. Notice that the Database Availability Group (DAG) is also listed, although its state is Not monitored. This is normal behavior, since the physical nodes of the DAG are already being monitored.

    The Exchange 2013 MP adds 3 views to the Monitoring pane (Figure 22):

    • Active Alerts
    • Organization Health
    • Server Health

    Expand Microsoft Exchange Server 2013, and then click Server Health. Right-click one of the Exchange servers listed and click Open > Health Explorer. By default, the view is scoped to unhealthy child monitors (Figure 23). Click Filter Monitors to clear the filter (Figure 24). Expand Entity Health to view the 4 health groups for Exchange Server 2013:

    • Customer Touch Points: components with direct, real-time customer interaction (e.g. OWA).
    • Service Components: components without direct, real-time customer interaction (e.g. OAB generation).
    • Server Components: physical resources of a server (e.g. disk, memory).
    • Key Dependencies: the server's ability to call out to dependencies (e.g. Active Directory).


    With the Exchange 2013 Management Pack imported, SCOM is now prepared to receive the escalated alerts from Managed Availability that require human attention.

    In the next part we'll take a deeper look into Managed Availability, specifically how to interact with it and perform some configuration tasks that used to be part of the process of installing and configuring the MP.

    If you would like to read the other parts in this article series please go to:



  • Linux – Windows Fully Dedicated Servers #dedicated #servers, #server, #servers, #managed


    We offer Linux-based operating systems including Debian (our recommendation), Ubuntu, and CentOS at no extra charge.

    We also offer a choice of Microsoft Windows 2008 Server R2 Editions (Web, Standard and Enterprise), which cost from an additional /month.

    If you do not want to administer the server manually then we strongly recommend a control panel to allow you to administer the server and set up Web sites / email accounts via a Web-based interface. Our recommended control panel is cPanel/WHM, which costs an extra 25.00/month. We offer Fantastico for a further 15.00/month on our dedicated servers.

    We provide a free Magento (e-commerce solution) installer tool with cPanel/WHM. If you would like more information or advice on Magento hosting, please contact us.

    We do not currently offer any control panel solutions for Windows, and do not recommend Windows for simple Web and email hosting requirements.



    MobaXterm Xserver with SSH, telnet, RDP, VNC and X11 #x #server,


    In order to install these plugins, just download them and put them in the same directory as the MobaXterm executable.
    If you need to enhance MobaXterm with extra tools and commands, you can also use the MobApt package manager: type MobApt (or apt-get) inside the MobaXterm terminal.

    CygUtils.plugin: Collection of core UNIX tools for Windows

    Corkscrew: Corkscrew allows you to tunnel TCP connections through HTTP proxies

    Curl: Curl is a command line tool for transferring data with URL syntax

    CvsClient: A command line tool to access CVS repositories

    Gcc, G++ and development tools: the GNU C/C++ compiler and other development tools

    DnsUtils: This plugin includes some useful utilities for host name resolution: dig, host, nslookup and nsupdate.

    E2fsProgs: Utilities for creating, fixing, configuring, and debugging ext2/3/4 filesystems.

    Emacs: The extensible, customizable, self-documenting real-time display editor

    Exif: Command-line utility to show EXIF information hidden in JPEG files.

    FVWM2: A light but powerful window manager for X11.

    File: Determines file type using magic numbers.

    Fontforge: A complete font editor with many features

    GFortran: The GNU Fortran compiler.

    Git: A fast and powerful version control system.

    Gvim: The Vim editor with a GTK interface

    Httperf: A tool for measuring web server performance.

    Joe: Fast and simple editor which emulates 5 other editors.

    Lftp: Sophisticated file transfer program and ftp/http/bittorrent client.

    Lrzsz: Unix communication package providing the XMODEM, YMODEM and ZMODEM file transfer protocols.

    Lynx: A text-mode web browser.

    MPlayer: The ultimate video player

    Midnight Commander: Midnight Commander is a feature-rich text mode visual file manager.

    Mosh: Mosh has been included in the MobaXterm main executable since version 7.1, directly in the sessions manager. This plugin is deprecated.

    Multitail: Program for monitoring multiple log files, in the fashion of the original tail program.

    NEdit: NEdit is a multi-purpose text editor for the X Window System.

    Node.js: Node.js is a platform built on Chrome’s JavaScript runtime for easily building fast, scalable network applications. This plugin does not include NPM.

    OpenSSL: A toolkit implementing SSL v2/v3 and TLS protocols.

    PdKsh: An open-source implementation of the KSH shell.

    Perl: Larry Wall’s Practical Extraction and Report Language

    Png2Ico: Png2Ico converts PNG files to Windows icon resource files.

    Python: An interpreted, interactive object-oriented programming language.

    Ruby: Interpreted object-oriented scripting language.

    Screen: Screen is a terminal multiplexer and window manager that runs many separate ‘screens’ on a single physical character-based terminal.

    Sqlite3: Software library that implements a self-contained, serverless, zero-configuration, transactional SQL database engine.

    SquashFS: mksquashfs and unsquashfs tools allow you to create/unpack squashfs filesystems from Windows.

    Subversion (SVN): Subversion is a powerful version control system.

    Tcl / Tk / Expect: Tcl is a simple-to-learn yet very powerful language. Tk is its graphical toolkit. Expect is an automation tool for terminals.

    X11Fonts: Complete set of fonts for the X11 server.

    X3270Suite: IBM 3270 terminal emulator for Windows.

    XServers: Xephyr, Xnest, Xdmx, Xvfb and Xfake alternate X11 servers.

    Xmllint: A command line XML tool.

    Zip: Zip compression utility.

    Snaphat has also bundled some other plugins (Python, Cmake, Graphviz, Lua, Readline and CGDB) that you can download from his website.
    Rthomson has also bundled a “Tmux” plugin that you can download from his website.

    Sources for each plugin are available here.
    Each license can be found in the corresponding source package.



    Server Monitoring Software (SaaS), Server Performance Monitoring Tools #saas #server #monitoring


    Server monitoring software (SaaS) from Anturis

    Deploy Windows and Linux monitoring in minutes and scale with ease

    Robust, yet simple and affordable, Anturis hosted server monitoring software (SaaS) for both Windows and Linux allows you to:

    • Easily set up monitoring of critical server performance metrics, applications, and networks.
    • Receive alerts notifying you of incidents.
    • Analyze historical and real-time server statistics.
    • Get an eagle's-eye view of the health of your entire infrastructure.

    A server monitoring tool that works perfectly for both in-house and cloud servers

    Anturis provides server performance monitoring for them all: physical Windows and Linux in-house servers as well as AWS, Microsoft Azure, Rackspace, and DigitalOcean servers in the cloud.

    Set up server health monitoring by simply running a single shell command or through a standard Windows installer wizard – either will automatically install the agent and configure a typical server monitoring setup. There is no need to configure an SNMP daemon or WMI service. All that is required is to run the agent on your servers.

    Track important server metrics with ease

    • Server performance monitoring: CPU usage, CPU load, RAM, disk space, disk usage, SMART attributes.
    • Application monitoring: MySQL database, Apache server, log files, custom scripts, swap usage, OS processes, Windows services, Active Directory, JVM, Windows Event Logs.
    • External and internal monitoring via the following protocols: ICMP, TCP, SSH, FTP, SMTP/SMTPs, IMAP/IMAPs, POP3/POP3s, HTTP/HTTPs.
    • Network monitoring: Ping, SNMP devices, packet loss, jitter, latency, printer, network interfaces.
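An external TCP check of the kind listed above boils down to attempting a connection within a timeout. The function below is a generic sketch of that technique, not Anturis's implementation:

```python
import socket

def tcp_check(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

A monitoring agent would typically run probes like this on a schedule and raise an alert only after several consecutive failures, rather than on a single failed connection.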

    Email, SMS, and voice call notifications

    The Anturis server monitoring tool offers customizable email, SMS, and voice call alerts and employs techniques that help eliminate false alerts and alert spam (multiple alerts for the same problem).

    Alerting can be configured according to:

    • Problem/incident severity.
    • Component dependencies.
    • Monitor status-change rules.
    • Warning and error thresholds.
    • Staff members' responsibilities.
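Warning and error thresholds of the kind listed above are usually evaluated as a simple two-level comparison against each metric sample. The function below is an illustrative sketch of that logic, not Anturis code:

```python
def monitor_status(value, warning_threshold, error_threshold):
    """Classify a metric sample against warning and error thresholds (higher is worse)."""
    if value >= error_threshold:
        return "error"
    if value >= warning_threshold:
        return "warning"
    return "ok"

# Example: CPU usage of 92% with warn=80 and error=95 is a warning, not yet an error.
```

In a real system the status change, not the raw comparison, triggers the alert, which is one way tools suppress repeated notifications for the same ongoing problem.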

    Infrastructure schema view with impact-dependencies modeling

    Anturis server monitoring software allows you to build dependency charts that show how network devices, databases, servers, and other components are related, and can include such information in alerts.

    The hierarchical arrangement helps to avoid notification spam (in case several components fail at the same time) and gives you an eagle's-eye view of the health of your entire infrastructure.

    External monitoring from multiple geographic locations

    Today's global economy means that your network must be available globally. Anturis employs global coverage to ensure that your websites, applications, and services are available from diverse regions and countries. Our multiple global locations allow you not only to monitor your servers' availability, but also to compare the response time and full-web-page load time from various locations. Anturis maintains locations in the U.S., Canada, South America, Western Europe, Russia, China, and Singapore.

    Complete network monitoring

    Your network consists of more than just your servers, and the Anturis monitoring tool covers a wide variety of other tasks, such as checking for open ports, confirming the status and connectivity of your printers, and monitoring the QoS of your VoIP system. Anturis monitors virtually all SNMP-enabled devices, such as routers and switches.

    Data presentation and reports

    Get the information you need to keep your entire network in top shape through network maps, charts, tables, and dashboards. Review daily, weekly, or monthly reports to analyze your servers' performance and uptime metrics. You choose the timeframe and the format.




    The best free parental control software 2017 #best #free #server #monitoring



    The best free parental control software 2017

    Keep your children safe online with the best free parental control software for Windows

    It’s hard to imagine anything less child-friendly than an uncensored internet. A rabid wolf, maybe, or a playground floored with broken glass and razor wire. The more connected we become, the more we need to look out for everybody online – and that means trying to ensure that our children aren’t exposed to the very worst content, ideas and behaviour that exist online.

    Software can’t do everything, of course, but it can help to make parents’ lives much easier. These are our picks of the best parental control tools.

    1. Qustodio

    A full suite of parental control tools to protect kids from the worst of the web

    Most parental control software is aimed at Windows, but Qustodio (think ‘custodian’) is also available for Mac, Android, iOS, Kindle and (weirdly) Nook.

    The free version covers the basics, enabling you to set rules and time schedules, and block pornography and other unsuitable content; the paid-for version adds SMS monitoring, social media features and per-app controls. But even the free version is one of the most comprehensive parental control apps around.

    Its raft of features and support for a wide range of platforms make Qustodio the best free parental control software, but there are some other excellent free programs available, some of which may be better suited to your individual needs as a parent. Read on for our top choices.

    2. OpenDNS Family Shield

    Block domains on your whole home network at router level

    FamilyShield is a free service from OpenDNS. Its parental control tools automatically block domains that OpenDNS has flagged under the headings “tasteless, proxy/anonymizer, sexuality, or pornography”.

    One of the big pluses here is that while FamilyShield can run on PCs and mobile devices, you can also apply it to your network router and filter all the traffic that passes through it; it’s just a matter of changing the DNS server numbers in your control panel.

    This has the happy benefit of improving DNS lookup speeds on some ISPs. By filtering everything at the router level, every device on your network benefits from the filters.

    3. Kidlogger

    Comprehensive activity logging so you know what your kids have been up to

    Nothing gets past Kidlogger. This free parental control software not only tracks what your children type and which websites they visit; it also keeps a record of which programs they use and any screengrabs they take.

    If you’re concerned about who your kids might be talking to online, there’s even a voice-activated sound recorder. If your children are a little older and more responsible, you can pick and choose which options to monitor and give them a little privacy.

    The free software only covers one device and lacks some of the sneakier features of the premium editions (including silent monitoring of WhatsApp conversations and the ability to listen to phone calls), but it’s still a well-rounded tool if you’re concerned about your kids’ safety.

    4. Spyrix Free Keylogger

    Find out what the kids have been typing, and if they might be in trouble

    Keyloggers have something of a bad reputation online, as they’re often used by villains, but they can be a force for good too, and Spyrix’s features enable you to see what your children have been up to.

    Although it’s dubbed parental control software, it’s really a monitoring program; it doesn’t stop the kids getting up to no good, but it does let you see exactly what they’ve done.

    That means it isn’t really appropriate for younger children’s computers, but it may be useful for older children if you suspect online bullying or other unpleasantness.

    5. Zoodles

    A whole browser designed for younger kids, but quite easily circumvented

    The problem with many parental control apps is that they’re most effective for older children: while filtering adult content and other unpleasantness is obviously a good thing, there’s plenty of content that isn’t adult but can still scare younger children silly.

    Zoodles addresses that by combining filtered browsing and a dedicated web browser to create a walled garden: everything in it is safe for kids and there’s no risk of anything awful popping up.

    In addition to Windows, Zoodles is also available for Mac, Android and iOS, and a brand new version is currently in development.



    Understanding GPO in Windows Server 2012 #server #2012, #group #policy, #windows


    Understanding GPO in Windows Server 2012

    Group Policies are computer or user settings that can be defined to control or secure the Windows server and client infrastructure. It is important to understand GPOs in Windows Server 2012 before actually configuring and applying policy settings, and there are some new features of GPO in Windows Server 2012.

    Understanding GPO in Windows Server 2012

    The two main components of a GPO are the GPO object and the GPO policy settings.

    GPO Object: A GPO object is an Active Directory object that contains various group policy settings. These policy settings can be user settings or computer settings and can be applied to users or computers. GPO objects are stored in the GPO container. The GPO object is stored in the Active Directory database and each object has its own unique GUID (Globally Unique Identifier).

    GPO Policy Settings: GPO policy settings are the actual settings within a GPO object that define a particular action. GPO policy settings come from GPO templates, which are stored in the SYSVOL folder of each domain controller. For example, Prohibit Access to Control Panel is a GPO policy setting that simply disables access to Control Panel. Most GPO settings can be Enabled, Disabled or Not Configured. An example is shown below.

    When you create a group policy, the GPO object is created and stored in the GPO container in Active Directory and, at the same time, a GPO template is created and stored in the SYSVOL folder. After creating a group policy, it can be linked to Sites, Domains and OUs. Group policy is processed in the order of LSDOU:

    1. Local Group Policy
    2. Sites
    3. Domains
    4. Organizational OUs

    There are certain things that you should remember while creating and applying GPO settings. As stated earlier, each GPO object has computer settings and user settings. Computer settings are applied at startup of the client machine. User settings are applied at user logon. A policy refresh can be initiated manually by using the gpupdate /force command or the Invoke-GPUpdate PowerShell cmdlet.
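The LSDOU order above matters because settings applied later override conflicting settings applied earlier: Local first, then Site, Domain, and finally OUs. As a rough mental model (illustrative only, not how Windows actually stores policies), the effective result behaves like dictionaries merged in that order:

```python
def effective_policy(local, site, domain, ous):
    """Merge policy dicts in LSDOU order; later scopes override earlier ones."""
    effective = {}
    # OUs apply last, parent OU before child OU, so the closest OU wins.
    for scope in [local, site, domain, *ous]:
        effective.update(scope)
    return effective

# If the domain enables a setting but a linked OU disables it, the OU value wins.
```

This is why a setting configured at the OU level takes precedence over the same setting configured at the domain level (barring Enforced links and Block Inheritance, which modify the default order).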

    On a fresh domain controller there are two default group policies configured. They are:

    1. Default Domain Policy: This policy is linked to the entire domain and contains policies such as password policies, account lockout policies and Kerberos protocol policies. It is recommended not to edit this policy. If you want to apply new group policy settings at the domain level, create a new GPO and link it to the domain.
    2. Default Domain Controller Policy: This policy is applied to domain controllers and is linked to the Domain Controllers OU. This policy affects domain controllers only.

    You may also like –

    • Understanding Logical Structure of Active Directory
    • Install Domain Controller in Windows Server 2012
    • Install Remote Desktop Services in Windows Server 2012
    • Create User Account in Server 2012 Domain Controller
    • Configure Shadow Copy of Shared Folder in Server 2012
    • Configure FTP Server in Windows Server 2012
    • Script to Create AD User Accounts from MS Access File



    HP ProLiant ML110 G7 review #hp #proliant #server #ml


    HP ProLiant ML110 G7 review

    Page 1 of 2 HP ProLiant ML110 G7 review

    The new ProLiant ML110 G7 is designed primarily for small businesses with limited on-site IT skills looking for their first server. It’s also the first HP product to sport Intel’s latest Xeon E3 processor. It goes head-to-head with Dell’s PowerEdge T110 II, also equipped with an Intel Xeon E3, which took over at the top of the A-List last month as our favourite pedestal server.

    Can this ProLiant topple the newly crowned Dell? It certainly makes a good start. The HP scores higher for remote server management, since the ML110 features an embedded iLO3 controller. This is the same controller that’s found in all the high-end ProLiant servers, first appearing in the ProLiant DL380 G7.

    The iLO3 shares access with the first of the server’s two Gigabit Ethernet ports, but you can buy an optional upgrade that adds a dedicated management port. These features make the ProLiant ML110 G7 well suited to remote sites or IT providers, since it can be accessed easily over the internet for full remote diagnostics and, with the optional upgrade, remote control.

    The ML110’s power options are also superior: either a fixed 350W supply or up to two 460W hot-plug supplies. The review system included a 460W hot-plug supply with a second module costing around £155 exc VAT extra. The server is easy on the power, too, with our inline power meter clocking the HP at only 35W with Windows Server 2008 R2 idling. With SiSoft Sandra fully exercising the eight logical cores of the Xeon E3 processor, this peaked at only 97W.

    HP is offering a host of processor options: along with Core i3, there’s a choice of five Xeon E3 models. The 3.3GHz Xeon E3-1240 in the review system sits in the middle of this group, but you can save cash and opt for the slightly slower 3.1GHz E3-1220. This is fitted in the base server model, which costs only £455 exc VAT.

    The HP’s storage is exemplary. A lockable front panel hides a hard disk cage with four removable drive carriers. In the base model the cage is wired directly to the embedded B110i SATA RAID controller, which supports stripes, mirrors and cold-swap drives. For hot-swap and SAS support you can specify an HP Smart Array RAID P212 or P410 PCI Express card. The latter has a pair of quad-port SAS connectors, and with this in place you can use the optional SFF bay that supports eight hot-swap 6Gbits/sec SAS, nearline SAS or SATA hard disks.

    Page 1 of 2 HP ProLiant ML110 G7 review



    Connecting to a Fax Server (Windows) #how #to #setup #a #fax


    Connecting to a Fax Server

    In the Win32 Environment

    To establish a connection to the local fax server, a fax client application must call the FaxConnectFaxServer function successfully before it calls any other fax client function. FaxConnectFaxServer returns a fax server handle that is required to call many other fax client functions. The FaxConnectFaxServer function can be used only with a local server.

    Note that a connection to a fax server is not required to print a fax document to a fax printer device context. An application can provide transmission information directly to the fax client Windows Graphics Device Interface (GDI) functions, and transmit the active document by printing it to a printer device context. The fax client GDI functions include the FaxStartPrintJob and the FaxPrintCoverPage functions. For more information, see Printing a Fax to a Device Context.

    Call the FaxClose function to disconnect from the fax server and deallocate the handle that the FaxConnectFaxServer function returns.
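The pairing described above (every successful FaxConnectFaxServer matched by a FaxClose that deallocates the handle) is the classic acquire/release handle discipline. The sketch below illustrates that pattern generically; the FaxServerHandle class and its connect/close hooks are hypothetical stand-ins for demonstration, not the actual Win32 fax API:

```python
class FaxServerHandle:
    """Illustrative acquire/release wrapper: connect on enter, always close on exit."""

    def __init__(self, connect, close):
        self._connect = connect   # stand-in for FaxConnectFaxServer
        self._close = close       # stand-in for FaxClose
        self.handle = None

    def __enter__(self):
        # The connection must succeed before any other call is made.
        self.handle = self._connect()
        return self.handle

    def __exit__(self, exc_type, exc, tb):
        # Deallocate the handle on every exit path, including errors.
        if self.handle is not None:
            self._close(self.handle)
            self.handle = None
        return False  # propagate any exception
```

In C, the same guarantee is achieved by ensuring every code path that obtained a handle from FaxConnectFaxServer eventually reaches the matching FaxClose call, including error paths.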

    In the COM Implementation Environment

    If you are writing a C/C++ application, you must call the CoCreateInstance function to retrieve a pointer to an IFaxServer interface and create an instance of a FaxServer object. Then you must call the IFaxServer::Connect method to initiate a connection with an active fax server. The server connection is required before the application can access most interfaces that begin with IFax. (A fax server connection is not required to access an IFaxTiff interface.) For more information about creating a FaxServer object, and for a list of properties and methods, see IFaxServer.

    If you are writing a Microsoft Visual Basic application, you must call the Visual Basic CreateObject function to create an instance of a FaxServer object. Then you must call the Connect method of the FaxServer object to initiate a connection with an active fax server. The server connection is required before the application can access most other objects that begin with Fax. (A fax server connection is not required to access an IFaxTiff object.) See FaxServer object (Visual Basic) for more information about the steps required to create the object, and for a list of properties and methods of the object.


    Build date: 5/5/2012



    Hotel Downtown Atlanta #atlanta #dedicated #server



    Cookie policy

    Four Seasons uses “cookies” to give you the best experience. Please refer to the section on cookies in our Privacy Policy here for a description of how we use cookies to enhance your experience on www.fourseasons.com. We have recently changed our policy on cookies. To find out more, click here. If you accept and want to continue your session with cookies please click Accept. You may follow this link to learn how to manage cookies through your web browser: http://www.aboutcookies.org/how-to-control-cookies/. By continuing to use this site without changing your settings you consent to our use of cookies in accordance with our Privacy Policy.




    SOLUTION High Availability SQL Server Interview Question? #microsoft #sql #server, #microsoft


    High Availability SQL Server Interview Question?

    That means an always on witness server that is talking to the same disks on a local SAN unit. You can’t really do that from a distance.

    Really Jimpen? I can’t believe you confidently posted this.
    First, by centralised storage, I meant a SAN, not a NAS.

    Secondly, anyone can implement geographical redundancy for both the servers and the storage, as long as there’s a high-speed fibre network to support this kind of implementation.

    How do you think organisations carry on with business uninterrupted when there’s an entire site shutdown or a shutdown of one of their key datacentres?

    I will explain how to implement a two-node HA solution using two sites over a GAN:
    One can implement HA across a metropolitan area network, with two SANs replicated across the two sites over high-speed fibre connectivity.

    We’ve done this using both EVA and NetApp. What’s key is implementing storage replication across the two sites. With NetApp, you can use SnapMirror.
    You need two SAN directors, one at each geographically separated site, and this is where you will connect your fibre to the servers (via HBA cards).

    So site 1 will have a server, a SAN director and your replicating SAN.
    Site 2 will also have the same thing.
    Then clustering will take place conveniently and this is the true meaning of HA where even your storage is also highly available.

    Let me know if you have any questions or if you need help to implement this. I can help.

    Come on Jimpen,
    You’re contradicting yourself, but I will help shed some light and get you back on track.
    When I stated that the servers can be geographically separated, you responded “Not really”, meaning that the servers can’t be geographically separated.

    You certainly were not talking about the cost of implementation. If you had wanted to talk about expenses, you would have stated so.

    Are you perhaps trying to run away from what you’ve already written here? lol. Of course you can’t.
    Keep things simple: when you state something at EE, always let it be what you mean, because we’re resolving a query.

    If I examine everything you’ve stated, it looks like you didn’t know that the implementation I talked about is possible, so you’ve mixed things up.

    If a client has a single SAN, two data centres (meaning two geographical locations) and two SAN directors, then the LUNs scanned (via HBAs) and presented to the different servers will be the same, and they will be off the same SAN irrespective of where the servers are connecting from.
    I figured out that you didn’t know this was possible based on your response in comment ID 38971591, where you stated that “SQL needs the HBA’s to be the same to work correctly.”

    That’s why you talked about a NAS while responding, when you said “You want a SAN unit, not a NAS” in the same comment ID 38971591; you didn’t consider the networking required across the WAN/GAN/MAN, i.e. between the different data centres.
    The good news is that two or more SAN directors can be configured to work this way, hence you cannot say “not really” in response.

    Lastly, I would like to comment on your concept of “a remote location SAN system”, as you state above. This is more a matter of implementing HA at the storage level, which also comes in handy for DR purposes, depending on how you look at it and use it at a point in time.

    Bottom line: if you were trying to give an answer that is in a “small normal range”, as you state, then you would not have said that what seems abnormal is impossible. It is indeed possible.

    Please let me know if you need any more info, or if you want to implement this anywhere in the world; feel free.
    I hope we’re now on the same page 🙂



    Dell Compellent Enterprise Manager #dell #server #management #software


    Dell Compellent Enterprise Manager
    Unmatched storage control and simplicity


    Enterprise Manager simplifies storage resource management by providing comprehensive monitoring and management of all local and remote Dell Compellent Storage Center systems. You can gain instant visibility and control of a multi-terabyte, multilocation environment, streamlining administration and reducing operating costs. Configure and verify remote replication processes, monitor storage capacity and disk utilization in real time, and generate comprehensive storage usage and performance reports — all from a single pane of glass.

    Storage Resource Management

    Dell Compellent Enterprise Manager simplifies network storage management by providing a single, centralized console for the administration of multiple local and remote Storage Center SANs. Users can configure and verify remote replication processes, monitor storage capacity and disk utilization in real time, and generate comprehensive enterprise storage usage and performance reports.

    How to Simplify Storage Admin with Enterprise Manager

    • Complete storage resource management for Dell Compellent enterprise storage environments
    • Drastically cuts day-to-day management time, resources and technology training
    • View your system from the standpoint of capacity, performance and path utilization, all from a granular point in time
    • Cuts administration time with a single interface and a complete view of storage resources
    • Simplifies disaster recovery process with reduced configuration time and easy online replication verification
    • Reduces disk costs with reports that allow you to accurately assess storage resources and plan for future capacity needs
    • Speeds event resolution with centralized alert notification and event log management
    • Optimizes performance by allowing you to identify and manage trends
    • Showcases the cost and power savings of Dell Compellent storage
    • Maximize resource utilization and reduce disk costs using accurate capacity and performance data
    • Streamline disaster recovery planning and replication configuration with a simple point-and-click interface
    • Identify trends and monitor enterprise storage use by business unit for accurate needs assessment and chargeback
    • Automatically calculate energy savings and generate boardroom-ready hero reports


    Complex management limits the benefits of virtualization

    Managing complex storage tasks like replication and capacity planning for multi-terabyte SANs at multiple locations can be daunting, particularly if you require more than one interface. Operational costs and management complexity can grow exponentially with your enterprise, making administration even more difficult. Companies require a better and easier way to monitor and manage their storage area networks.

    Comprehensive SAN management

    Dell Compellent Enterprise Manager simplifies administration of Dell Compellent environments by providing comprehensive monitoring and management of all local and remote Storage Center SANs. You can drastically cut day-to-day SAN management time with a single interface that provides a complete view of your storage environment, streamlining administration and reducing operational costs.

    With Enterprise Manager, you can reduce costs by maximizing utilization and purchase storage more efficiently using accurate capacity and performance data. Streamline disaster recovery by reducing replication planning and configuration time with a point-and-click interface that helps you set up remote replication in as few as six clicks.

    Multiple systems, single interface

    With Enterprise Manager, all local and remote Storage Center systems are discovered using a single console. This centralized interface provides a complete view of all aspects of your Dell Compellent storage environment, significantly reducing storage administration time. You can choose from a variety of comparative views across all Storage Center systems, including total space available, used space, free space, number of volumes, number of Replays, number of replications, and savings vs. RAID 10.

    Easy-to-use reporting for informed decisions

    Enterprise Manager’s extensive system monitoring provides immediate insight into your storage environment, while easy-to-use reports summarize capacity utilization, replications and events. Make informed decisions and showcase your success as you decrease storage expenditures, simplify management, streamline disaster recovery and increase data center efficiency. Enterprise Manager lets you automatically e-mail reports daily, weekly or monthly, determine storage costs associated with the different tiers of capacity in your environment, translate storage technologies into actual dollar savings with boardroom-ready hero reports, and provide a powerful foundation for a green IT strategy by automatically calculating energy costs and CO2 emissions.

    Reduce spending and optimize performance

    The ability to accurately assess storage resources and plan for future capacity and performance needs reduces overall disk spending. Enterprise Manager simplifies capacity planning, increasing your storage purchase efficiency. You can view storage capacity utilization on all of your Storage Center systems over a period of time, including summaries from last week, last month, or last year, as well as important I/O details to help optimize performance.

    Easily view current and historical consumption for specific storage volumes over time to balance server loads and increase server purchase efficiency. Storage consumption reports include total space available, allocated space, used space and configured space for all disks and total space and used space for any RAID selection with any disk tier.

    You can gain valuable insight into current and historical usage, and optimize capacity and performance planning, with threshold alerts for storage and I/O usage plus CPU and memory utilization information.

    Detailed information helps you purchase storage more efficiently.

    Simplify replication management

    Enterprise Manager simplifies disaster recovery with remote replication setup in just a few clicks. This streamlined replication management also allows online replication verification and rapid recovery with a single-click disaster declaration.

    Using Enterprise Manager, you can easily and accurately estimate bandwidth requirements upfront based on actual data. Advanced bandwidth-shaping algorithms allow you to utilize the lowest bandwidth required while maintaining optimal performance. And you can monitor replication and bandwidth utilization over time to understand requirements and optimize transfer rates on an hourly basis throughout the day.

    Certified VMware vCenter™ Site Recovery Manager (SRM) integration helps you enhance disaster recovery protection for your VMware virtualized environment.

    Quickly implement replication configurations that were previously too complex or time-consuming.

    Chargeback calculates true costs of storage

    Enterprise Manager’s storage-based chargeback feature automatically calculates storage costs based on the actual space consumed by applications. Administrators can assign different costs to each volume based on disk class or storage tier. Enterprise Manager monitors storage utilization and generates reports that identify the cost of storage consumption based on department name or account number. Chargeback reports can be scheduled for automatic email delivery to business units daily, weekly, monthly or quarterly.
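    The chargeback arithmetic described above (a per-GB rate assigned to each disk tier, applied to the space actually consumed and rolled up by department) can be sketched in a few lines. This is a toy illustration only, not Enterprise Manager code; the tier rates, department names and volumes are all invented:

    ```python
    # Toy chargeback model: cost = space consumed per volume x the rate
    # for its disk tier, summed per department. All figures are invented.
    TIER_RATES = {"tier1": 0.90, "tier2": 0.45, "tier3": 0.15}  # $ per GB-month

    volumes = [
        {"dept": "finance",   "tier": "tier1", "used_gb": 500},
        {"dept": "finance",   "tier": "tier3", "used_gb": 2000},
        {"dept": "marketing", "tier": "tier2", "used_gb": 800},
    ]

    # Roll volume-level costs up to a per-department total.
    costs = {}
    for v in volumes:
        costs[v["dept"]] = costs.get(v["dept"], 0.0) + v["used_gb"] * TIER_RATES[v["tier"]]

    for dept, cost in sorted(costs.items()):
        print(f"{dept}: ${cost:.2f}")
    ```

    A real report would read the consumption figures from the monitoring data rather than a hard-coded list, but the cost attribution works the same way.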

    Powerful SRM with unmatched simplicity

    Enterprise Manager delivers powerful storage resource management software for Dell Compellent Storage Center SANs paired with unmatched control and simplicity. The complete Enterprise Manager suite features the following licensed software components: Enterprise Manager Foundation, Discovery, Replication Management, VMWare SRM Adapter, Event Management, Free Space Recovery, Enterprise Manager Reporter, Performance Management, Capacity Management, Threshold Alerting, Enterprise Manager Chargeback, Storage-Based Chargeback, Hero Reports, Power Savings.




    Index of #ubuntu #server #monitor


    Ubuntu 15.04 (Vivid Vervet)

    Select an image

    Ubuntu is distributed on two types of images described below.

    Desktop image

    The desktop image allows you to try Ubuntu without changing your computer at all, and at your option to install it permanently later. This type of image is what most people will want to use. You will need at least 384MiB of RAM to install from this image.

    There are two images available, each for a different type of computer:

    64-bit PC (AMD64) desktop image
    Choose this to take full advantage of computers based on the AMD64 or EM64T architecture (e.g. Athlon64, Opteron, EM64T Xeon, Core 2). If you have a non-64-bit processor made by AMD, or if you need full support for 32-bit code, use the i386 images instead.

    32-bit PC (i386) desktop image
    For almost all PCs. This includes most machines with Intel/AMD/etc type processors and almost all computers that run Microsoft Windows, as well as newer Apple Macintosh systems based on Intel processors. Choose this if you are at all unsure.

    Server install image

    The server install image allows you to install Ubuntu permanently on a computer for use as a server. It will not install a graphical user interface.

    There are two images available, each for a different type of computer:

    64-bit PC (AMD64) server install image
    Choose this to take full advantage of computers based on the AMD64 or EM64T architecture (e.g. Athlon64, Opteron, EM64T Xeon, Core 2). If you have a non-64-bit processor made by AMD, or if you need full support for 32-bit code, use the i386 images instead.

    32-bit PC (i386) server install image
    For almost all PCs. This includes most machines with Intel/AMD/etc type processors and almost all computers that run Microsoft Windows, as well as newer Apple Macintosh systems based on Intel processors. Choose this if you are at all unsure.

    snappy Ubuntu Core images

    The snappy Ubuntu Core image allows you to install Ubuntu permanently on a computer, using the snappy transactional update system.

    There are two images available, each for a different type of computer:

    64-bit PC (AMD64) snappy image
    Choose this to take full advantage of computers based on the AMD64 or EM64T architecture (e.g. Athlon64, Opteron, EM64T Xeon, Core 2). If you have a non-64-bit processor made by AMD, or if you need full support for 32-bit code, use the Intel x86 images instead.

    ARMhf BeagleBone Black snappy image
    For BeagleBone Black boards.

    A full list of available files, including BitTorrent files, can be found below.

    If you need help burning these images to disk, see the Image Burning Guide .



    Welcome to Open Campus! #mcse, #mct, #mcp, #microsoft #certified, #systems #engineer,


    Our Mission.

    The mission of RCCD Distance Education (formerly Open Campus) is to extend access to learning through distance education. Objectives: To facilitate learning at a distance, Distance Education provides:

    • Educational technology to the colleges, faculty, and students to support the delivery of online-based courses and services
    • Professional development and training for faculty
    • Expertise and experience
    • Blackboard management, production and problem solving.

    As of April 21, the Open Campus department has changed its name to Distance Education. We thank you for your patience as we work to complete the changes to the website by the middle of summer.

    What Are Online Based or Distance Education classes?

    Online-based courses, also called Distance Education classes, may take two different forms:

    Online classes are taken exclusively over the Internet. Please note that, while some online courses provide all instructional content over the Internet, others may require some on-campus meetings. Please see the course schedule or WebAdvisor for more information.

    Hybrid classes meet both on campus and online. Think of them as a combination or blending of online classes and face-to-face classes.

    In a hybrid class, you will attend meetings on campus during the dates and times listed in the schedule of classes. Since the on-campus portion of hybrid classes could take place at any of our three colleges (Riverside City, Norco or Moreno Valley), hybrid classes are listed in the schedule by the college where the on-campus meetings will take place.

    Are there Face-to-Face classes that use the Internet?

    Web-Enhanced classes are traditional face-to-face classes that are supplemented with course websites and the use of Internet resources. Unlike hybrid or fully-online classes, all web-enhanced class meetings take place on campus.

    Where Do I Start?

    For more information about Blackboard and support, visit
    Students page



    Dedicated Server – Colocation – Virtual Private – Malaysia Hosting #dedicated


    Dedicated Server

    A business-class dedicated server provides greater flexibility, space and bandwidth, and unmatched reliability. A dedicated server is a single web server within a network of computers, dedicated solely to one customer, most often a large business. It is a type of Internet hosting in which the client leases an entire server not shared with anyone. A dedicated server is typically a rented service: the user rents the server, software and an Internet connection from the Web host. This is more flexible than shared hosting, as organizations have full control over the servers, including choice of operating system, database, hardware and specific applications. Dedicated servers are most often housed in data centers.

    • Intel Dual Core E6500 3.06Ghz
    • 2GB DDR RAM
    • NO Setup Fee
    • Free Remote Reboot
    • Free KVM over IP (On request)
    • Unmetered Bandwidth

    Only $99 /mo

    Colocation Services

    Colocation is a hosting option for small businesses that want the features of a large IT department without the costs. Colocation allows you to place your server in a rack in our data center and share our bandwidth as your own. It generally costs more than standard web hosting, but much less than the cost of a comparable amount of bandwidth into your place of business, along with the power required to run and effectively and consistently cool the equipment. Once you have a machine set up, you take it physically to the location of the colocation provider and install it in their rack, or you rent a server machine from the colocation provider. The company then provides an IP, bandwidth, and power to your server. Once it’s up and running, you access it much like you would access a website on a hosting provider, the difference being that you own the hardware. With IT and communications facilities in safe, secure hands, you can enjoy less latency and the freedom to focus on your core business.

    • 1U Single Co-Location
    • 10Mbps Shared Bandwidth
    • Unmetered Monthly Traffic
    • Advanced Layer 2 Firewall
    • Cat 6 Cable with Gigabit LAN
    • 99% Uptime Guarantee

    Only $80 /mo

    Virtual Private Server

    A virtual private server (VPS) is a single server partitioned into multiple sections that each function as an independent server. It is a hosting environment that combines both benefits of shared and dedicated hosting. It gives you the freedom to configure your server any way you want and costs less than buying your own server. Each virtual server can run its own full-fledged operating system, and each server can be independently rebooted. A VPS is the low cost solution that gives you all that functionality and requires less technical knowledge than having your own server. You have complete control of your server software, but we maintain the hardware and keep it up and running for you.

    • Cpu Resources: 400Mhz
    • 20GB Hardware Raid-10
    • NO Setup Fee
    • 256MB RAM
    • 260GB Monthly bandwidth
    • 1 Dedicated IP

    Only $19.99 /mo



    GoDaddy Dedicated Server Review #godaddy #dedicated #server #review, #godaddy #dedicated #server


    GoDaddy Review

    Price: $62.99 (2x120GB HDD, 2GB RAM)

    GoDaddy also has some of the best prices around. Even when taking advantage of some of their greatest packages, the pricing is still very minimal compared to some alternatives.

    When it comes to dedicated servers, GoDaddy.com has a few strides over the competition. Their many different packages and programs allow you choice and the ability to get what you actually need – nothing you don’t. What you’re probably looking for is peace of mind, security and the knowledge that you can trust the service provider you have just chosen, without all the hassle that can come with it. Here’s what GoDaddy offers you with their dedicated servers to help put your mind at ease.

    GoDaddy caters to both Windows and Linux servers with a wide range of operating systems and prices that can suit your needs. Each package is unique in its own way and gives you features from their fastest level of processing to their largest online storage. These are some of the many benefits and sweet perks you’ll receive as a customer of one of their hosting plans:

    The packages include Economy, Deluxe, Premium, Value Deal, Power Player and Storage Monster, depending on the types of services that you require.

    Bandwidth ranges widely from 500GB/month (Economy) to 3,000GB/month (Premium), allowing you to use and maintain the amount of data that you need to upload and run on your site.

    The Economy and Value Deal packages offer 2GB RAM for the modest user, while the others offer 4GB RAM, except the 64-bit options, which offer a full 8GB RAM.

    Year round you are backed by 24/7 support. They maintain their servers, keeping a watchful eye over your website to ensure that any and all threats are caught before they can do any damage.

    Not only do they offer security support, but they also offer 24/7 phone support for all your hosting questions, comments or concerns.

    All the dedicated server plans that GoDaddy offers come with these perks on top of the service you are getting: Up to $129.93 in Google Adwords Credit, $103.95 Bing / Yahoo Search Credit, $51.97 Facebook Advertising Credit, SSL Certificate, 3 dedicated IP Addresses, 10 Fotolia Credits and Bandwidth Overage Protection

    Their plans offer the operating systems Ubuntu, CentOS, Fedora and Windows, so that you have a wide array of places to start from.

    In addition to having both Linux and Windows plans and packages offered, they also have 64-bit plans that allow for the best possible performance and up to 8GB RAM. These packages give you up to two 64GB solid-state drives for all your memory needs and an allowable 3,000GB per month of bandwidth.

    All of GoDaddy’s servers are protected with TippingPoint Intrusion Prevention Systems, an advanced piece of software that constantly scans packets and determines whether they are malicious or perceived as a threat. The filters are updated on a regular basis to ensure that the most up-to-date protection is offered.

    Prices range from $65.47/month (Economy) up to $280.65/month (Premium) on a 12-month contract. The longer you sign up for, the better the deal gets. 1, 6, 12 and 24-month contracts are available for the deal that you think will be best. Their 64-bit package pricing ranges from $187.09 to $392.92/month on a 12-month commitment. Longer-term commitments are available should you require better deals.

    GoDaddy seems to have you covered when it comes to getting the best deal of the day. As opposed to other hosting companies, GoDaddy allows you to install almost anything on the server and run it (provided it’s not malicious), host different websites on just one account, and use the dedicated server for versatile purposes like gaming, virtual hosting and high-traffic websites. Whether you are a micro-business or an individual just looking for a dedicated server, GoDaddy is worth considering before you make your selection.



    What’s the difference between a CTE and a Temp Table?


    This is pretty broad, but I’ll give you as general an answer as I can.

    CTEs:

    • Are unindexable (but can use existing indexes on referenced objects)
    • Cannot have constraints
    • Are essentially disposable VIEWs
    • Persist only until the next query is run
    • Can be recursive
    • Do not have dedicated stats (they rely on the stats on the underlying objects)

    #temp tables:

    • Are real materialized tables that exist in tempdb
    • Can be indexed
    • Can have constraints
    • Persist for the life of the current CONNECTION
    • Can be referenced by other queries or subprocedures
    • Have dedicated stats generated by the engine

    As far as when to use each, they have very different use cases. If you will have a very large result set, or need to refer to it more than once, put it in a #temp table. If it needs to be recursive, is disposable, or is just to simplify something logically, a CTE is preferred.

    Also, a CTE should never be used for performance. You will almost never speed things up by using a CTE, because, again, it’s just a disposable view. You can do some neat things with them but speeding up a query isn’t really one of them.
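    The “disposable view” idea can be made concrete with a small sketch. It uses SQLite from Python purely for illustration (the discussion above is about SQL Server, and the sales table here is invented); the CTE just names a subquery for the duration of a single statement:

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)",
                     [("east", 100), ("east", 50), ("west", 75)])

    # The CTE acts as a one-off, unindexed view over the base table;
    # it exists only for the duration of this single statement.
    rows = conn.execute("""
        WITH region_totals AS (
            SELECT region, SUM(amount) AS total
            FROM sales
            GROUP BY region
        )
        SELECT region, total FROM region_totals
        WHERE total > 60
        ORDER BY region
    """).fetchall()
    print(rows)  # [('east', 150), ('west', 75)]
    ```

    Rewriting region_totals as an inline derived table produces the same plan; the CTE buys readability, not speed.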

    answered Feb 15 ’12 at 16:54

    Please see Martin’s comments below:

    The CTE is not materialised as a table in memory. It is just a way of encapsulating a query definition. In the case of the OP it will be inlined, the same as just doing SELECT Column1, Column2, Column3 FROM SomeTable. Most of the time they do not get materialised up front, which is why this returns no rows: WITH T(X) AS (SELECT NEWID()) SELECT * FROM T T1 JOIN T T2 ON T1.X = T2.X. Also check the execution plans. Though sometimes it is possible to hack the plan to get a spool. There is a Connect item requesting a hint for this. – Martin Smith Feb 15 ’12 at 17:08

    A CTE creates the table being used in memory, but is only valid for the specific query following it. When using recursion, this can be an effective structure.
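    A minimal recursive-CTE sketch, again using SQLite from Python in place of SQL Server (the employees table and its contents are invented), walks a management hierarchy in a single statement:

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE employees (id INTEGER, manager_id INTEGER, name TEXT)")
    conn.executemany("INSERT INTO employees VALUES (?, ?, ?)",
                     [(1, None, "ceo"), (2, 1, "lead"), (3, 2, "dev")])

    # The anchor member selects the root (no manager); the recursive member
    # repeatedly joins back onto the CTE to find each next level of reports.
    chain = conn.execute("""
        WITH RECURSIVE reports(id, name, depth) AS (
            SELECT id, name, 0 FROM employees WHERE manager_id IS NULL
            UNION ALL
            SELECT e.id, e.name, r.depth + 1
            FROM employees e JOIN reports r ON e.manager_id = r.id
        )
        SELECT name, depth FROM reports ORDER BY depth
    """).fetchall()
    print(chain)  # [('ceo', 0), ('lead', 1), ('dev', 2)]
    ```

    In T-SQL the syntax is the same minus the RECURSIVE keyword; this is the kind of traversal that would otherwise need a cursor or loop.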

    You might also want to consider using a table variable. This is used as a temp table is, is also in-memory only, and can be used multiple times without needing to be re-materialized for each join. Also, if you need to persist a few records now, add a few more after the next select, add a few more after another operation, and then return just that handful of records, this can be a handy in-memory structure.

    Temp Table

    A temp table is literally a table created on disk, just in a specific database (tempdb) that everyone knows can be deleted. It is the responsibility of a good dev to destroy those tables when they are no longer needed, but a DBA can also wipe them.

    Temporary tables come in two varieties: local and global. In terms of MS SQL Server, you use a #tableName designation for local and a ##tableName designation for global (note the use of a single or double # as the identifying characteristic).

    Notice that with temp tables, as opposed to table variables or CTE, you can apply indexes and the like, as these are legitimately tables in the normal sense of the word.

    Generally I would use temp tables for longer or larger queries, and CTEs or table variables if I had a small dataset already and wanted to just quickly script up a bit of code for something small. Experience and the advice of others indicates that you should use CTEs where you have a small number of rows being returned from it. If you have a large number, you would probably benefit from the ability to index on the temp table.

    The accepted answer here says “a CTE should never be used for performance” – but that could mislead. In the context of CTEs versus temp tables, I’ve just finished removing a swathe of junk from a suite of stored procs because some doofus must’ve thought there was little or no overhead to using temp tables. I shoved the lot into CTEs, except those which were legitimately going to be re-used throughout the process. I gained about 20% performance by all metrics. I then set about removing all the cursors which were trying to implement recursive processing. This was where I saw the greatest gain. I ended up slashing response times by a factor of ten.

    CTEs and temp tables do have very different use cases. I just want to emphasise that, while not a panacea, the comprehension and correct use of CTEs can lead to some truly stellar improvements in both code quality/maintainability and speed. Since I got a handle on them, I see temp tables and cursors as the great evils of SQL processing. I can get by just fine with table variables and CTEs for almost everything now. My code is cleaner and faster.

    answered Feb 17 ’12 at 9:49

    The primary reason to use CTEs is to access Window Functions such as row_number() and various others.

    This means you can do things like get the first or last row per group VERY VERY quickly and efficiently – more efficiently than other means in most practical cases.
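    A first-row-per-group query of that kind might look like the following sketch (the Orders table and its columns are assumed for illustration):

```sql
-- Latest order per customer: number the rows within each customer,
-- newest first, then keep only row 1 of each partition.
WITH ranked AS (
    SELECT customer_id,
           order_id,
           order_date,
           ROW_NUMBER() OVER (PARTITION BY customer_id
                              ORDER BY order_date DESC) AS rn
    FROM   dbo.Orders
)
SELECT customer_id, order_id, order_date
FROM   ranked
WHERE  rn = 1;  -- filter on the window function, which a plain WHERE cannot do
```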

    You can run a similar query to the above using a correlated subquery or a derived table, but the CTE will be faster in almost all scenarios.

    Additionally, CTEs can really help simplify your code. This can lead to performance gains because you understand the query more and can introduce more business logic to help the optimizer be more selective.

    Additionally, CTEs can boost performance if you understand your business logic and know which parts of the query should be run first. Typically, put your most selective queries first, so they lead to result sets that can use an index in their next join, and add the OPTION (FORCE ORDER) query hint.

    Finally, CTEs don’t use tempdb by default, so you reduce contention on that bottleneck through their use.

    Temporary tables should be used if you need to query the data multiple times, or alternatively if you measure your queries and discover that inserting into a temp table and then adding an index improves your performance.

    answered Oct 8 ’13 at 0:50

    There seems to be a bit of negativity here towards CTEs.

    My understanding of a CTE is that it’s basically a kind of ad hoc view. SQL is both a declarative and a set-based language. CTEs are a great way of declaring a set! Not being able to index a CTE is actually a good thing because you don’t need to! It’s really a kind of syntactic sugar to make the query easier to read/write. Any decent optimizer will work out the best access plan using indexes on the underlying tables. This means you could effectively speed up your CTE query by following the index advice on the underlying tables.

    Also, just because you defined a set as a CTE, it doesn’t mean that all rows in the set must be processed. Depending on the query, the optimizer might process “just enough” rows to satisfy the query. Maybe you only needed the first 20 or so for your screen. If you built a temp table then you really do need to read/write all those rows!

    Based on this I would say that CTEs are a great feature of SQL and can be used anywhere they make the query easier to read. I would only think about a temp table for a batch process that would really need to process every single record. Even then, afaik, it’s not really recommended, because on a temp table it’s far harder for the database to help you with caching and indexes. It might be better to have a permanent table with a PK field unique to your transaction.

    I have to admit that my experience is mainly with DB2, so I’m assuming that CTEs work in a similar way in both products. I will happily stand corrected if CTEs are somehow inferior in SQL Server. 😉

    answered Mar 24 ’14 at 22:30

    2017 Stack Exchange, Inc


    Posted In: NEWS




    Hybrid Business Intelligence with Power BI

    This week in the social media chatter, I noticed tweets regarding a new Microsoft white paper by Joseph D’Antoni and Stacia Misner published to TechNet on Hybrid Business Intelligence with Power BI. This white paper is a fantastic technical overview and a must-read for groups looking at Power BI and wondering how best to implement it with existing on-premises business intelligence (BI) or Azure Infrastructure as a Service (IaaS) hosted BI. Covered topics include:

    • hybrid BI technical architecture options
    • data management gateway
    • best practices for:
      • integrating security
      • identity management
      • networking
      • Office 365

    Aside from small businesses that may only have cloud hosted solutions, many businesses currently have a combination of cloud and on-premises data sources. Just think about how many groups use Salesforce.com, Google Analytics, Constant Contact, and other departmental cloud applications. Typically, I see those groups leveraging APIs or connectors to bring cloud data back on site into a local data warehouse for creating reports. We are taking those same concepts quite a bit further with Microsoft Azure and Power BI.

    Ideally, we are no longer moving all of the data in our big data world. Concepts like data virtualization, for example, are becoming more popular. Most likely, we are now tasked to deliver a transparent Microsoft BI experience across Office 365 and existing on-premises SharePoint portals or data sources.

    Understanding how to architect hybrid-BI scenarios is becoming a more important skill to master in our profession. However, prior to this new white paper, finding the answers and best practices for it was fairly challenging.

    Security in a Hybrid World

    Upon a brief skim through this new technical whitepaper, I noticed a lot of content around networking and identity management. Historically, identity management and security in Microsoft BI has not been easy to master. In a hybrid BI world, these topics appear to be comparable or even a bit more complex.

    Let’s face it, getting through a SharePoint 2013 BI farm installation and configuration can be a daunting process for even the top talent in the world. I usually advise folks considering a new SharePoint 2013 BI farm installation to first read Kay Unkroth’s incredible white paper to understand SharePoint security, Microsoft BI security, and Kerberos delegation concepts.

    Managing user security in Office 365 looks comparable to on-premises SharePoint security. There are options to federate Active Directory (AD) to Office 365 and use Single Sign On (SSO). There are additional alternatives for multi-factor authentication in scenarios where you require additional layers of security.

    In hybrid BI scenarios where you have Analysis Services or Reporting Services hosted on Microsoft Azure VMs, you might also need to configure Azure AD, AD Federation Services (ADFS), and the Azure Active Directory Sync tool to synchronize passwords, users, and groups between on-premises AD and Azure AD supporting the Office 365 installation. The new Hybrid Business Intelligence with Power BI white paper goes into detail on those concepts and includes links to a plethora of excellent resources.

    Data Management Gateway for Power BI

    At the moment, Data Management Gateway appears to be the key to hybrid BI with Office 365 Power BI. The Data Management Gateway is a client agent application that is installed on an on-premises server and copies data from internal data sources to the Power BI cloud data source format.

    Office 365 Power BI data sources are a bit of a cloud data island per se, but that should continue to evolve over time. Present Power BI data refresh capabilities (basically Excel workbooks deployed to a Power BI site) allow a single data refresh schedule from the following supported data sources:

    • On-premises SQL Server (2005 and later)
    • On-premises Oracle (10g and later)
    • Azure SQL Database
    • OData feed
    • Azure VM running SQL Server

    Now, if you have a VPN connection and Azure virtual network, it opens up many more potential data sources for Power BI. In that case, accessing data sources with Power BI data connections and scheduled refresh is similar to on-premises Power Pivot except it sure looks like you still need Data Management Gateway to get that data into Power BI-land. The white paper section labeled Power BI Data Refresh goes into deep detail on supported data sources, data refresh schedules, and various data location scenarios.

    Sending Feedback to Microsoft

    We are just beginning to see Microsoft BI and Power BI in a cloud and hybrid world. Groups that are using Power BI and hybrid BI today are early adopters. We would all benefit from hearing about their tips, tricks, and lessons learned. I see a lot of continual changes in Azure and total confusion out here especially around Azure cloud BI and Power BI with on-premises data sources.

    If you have Microsoft technical content requests, you can send feedback to the teams that develop these resources to get new topics on their radar. Don’t assume someone else has already expressed a need. If no one asks or complains, the folks in Redmond may be completely unaware of that need. It really is that simple.



    What is Gateway? Webopedia Definition



    Related Terms

    (n.)(1) A node on a network that serves as an entrance to another network. In enterprises, the gateway is the computer that routes the traffic from a workstation to the outside network that is serving the Web pages. In homes, the gateway is the ISP that connects the user to the internet.

    In enterprises, the gateway node often acts as a proxy server and a firewall. The gateway is also associated with both a router, which uses headers and forwarding tables to determine where packets are sent, and a switch, which provides the actual path for the packet in and out of the gateway.

    (2) A computer system located on earth that switches data signals and voice signals between satellites and terrestrial networks.

    (3) An earlier term for router, though now obsolete in this sense as router is commonly used.







    Dynamic PIVOT in Sql Server

    In the previous post, PIVOT and UNPIVOT in Sql Server, I explained how the PIVOT relational operator can be used to transform a column’s distinct values into columns in the result set, by listing all of the distinct column values in the PIVOT operator’s IN clause. This type of PIVOT query is called a Static PIVOT query: if the PIVOT column in the source table gets extra unique values after the initial query, they will not be reflected in the PIVOT query result unless they are also added to the IN clause. Static PIVOT queries are fine as long as we know that the PIVOT column values never change, for instance if the PIVOT column values are months, days of the week, or hours of the day.

    In this article I will show how to write a Dynamic PIVOT query with an example, where we don’t need to list each of the PIVOT column’s unique values and don’t need to worry if the PIVOT column gets extra unique values after the initial query.

    First, create a temporary table #CourseSales with sample records as depicted in the image below, using the following script:
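    The creation script itself is not reproduced in this copy of the post; a minimal version consistent with the later examples, assuming columns Course, Year and Earning, would be:

```sql
-- Sample #CourseSales table (column names and sample values are assumed,
-- since the original script is missing from this copy of the post).
CREATE TABLE #CourseSales (
    Course  VARCHAR(50),
    [Year]  INT,
    Earning MONEY
);

INSERT INTO #CourseSales VALUES ('.NET', 2012, 10000);
INSERT INTO #CourseSales VALUES ('Java', 2012, 20000);
INSERT INTO #CourseSales VALUES ('.NET', 2012, 5000);
INSERT INTO #CourseSales VALUES ('.NET', 2013, 48000);
INSERT INTO #CourseSales VALUES ('Java', 2013, 30000);
```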

    PIVOT #CourseSales Table data on the Course column Values

    Let us first understand the Static PIVOT query and then see how we can modify this Static PIVOT query to Dynamic.

    Static PIVOT query

    The Static PIVOT script below pivots the #CourseSales table data so that the Course column’s distinct values are transformed into columns in the result set, as depicted in the image above.
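    The script is missing from this copy of the post; a static PIVOT consistent with the description, assuming #CourseSales has columns Course, Year and Earning, would be:

```sql
-- Static PIVOT: every course name must be hardcoded in the IN clause.
SELECT [Year], [.NET], [Java]
FROM   #CourseSales
PIVOT  (SUM(Earning) FOR Course IN ([.NET], [Java])) AS pvt;
```

Any course value not listed in the IN clause is simply dropped from the result.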

    Let us insert one more row into the #CourseSales table for the new course SQL Server with the insert statement below.
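    The insert statement is not reproduced here; assuming the same columns as above, it would be along these lines (the year and amount are illustrative):

```sql
-- Add a row for a course that the static PIVOT's IN clause does not mention.
INSERT INTO #CourseSales VALUES ('SQL Server', 2013, 15000);
```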

    From the above result it is clear that the newly added course SQL Server’s sales data is not reflected in the result.

    Dynamic PIVOT Query

    To make the above Static PIVOT query dynamic, we basically have to remove the hardcoded PIVOT column names specified in the PIVOT operator’s IN clause. The query below demonstrates this.
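    The query itself is missing from this copy of the post; a common way to do it, assuming the #CourseSales columns above, is to build the column list with STUFF/FOR XML PATH and run the PIVOT through sp_executesql:

```sql
DECLARE @cols NVARCHAR(MAX), @sql NVARCHAR(MAX);

-- Build a comma-separated, quoted list of the distinct Course values.
SELECT @cols = STUFF(
    (SELECT DISTINCT ',' + QUOTENAME(Course)
     FROM #CourseSales
     FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'),
    1, 1, '');

-- Inject the column list into the PIVOT and execute it dynamically.
SET @sql = N'SELECT [Year], ' + @cols + N'
FROM #CourseSales
PIVOT (SUM(Earning) FOR Course IN (' + @cols + N')) AS pvt;';

EXEC sp_executesql @sql;
```

Because the column list is rebuilt from the table on every run, newly added courses appear in the result automatically.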

    From the above result it is clear that this query is a true Dynamic PIVOT query, as it reflected all the courses in the #CourseSales table without needing hardcoded course names in the PIVOT query.

    Examples of PIVOT and Dynamic PIVOT

    Below are some examples of retrieving data in Sql Server using PIVOT and Dynamic PIVOT:

    Post navigation

    Deepak Bhise says:

    I am executing the query below, where the error is Incorrect Syntax Near Pivot

    with taxdet as
    (Select tbi.bill_item_id,tbi.Sr_number, tbi.item_id,tbi.total_amt, tbt.tax_amt, tbt.sub_tax_id,
    tbm.Bill_no, tbm.Bill_amount, tbm.current_table_no, tstm.SubTax_Name
    from tblBill_Items tbi
    Right Join tblBill_taxes tbt on tbt.Corelation_id = tbi.bill_item_id
    Right Join tblBill_master tbm on tbm.Sr_number = tbi.Sr_number
    Left Join tblSubTax_master as tstm on tstm.SubTax_id = tbt.sub_tax_id
    where tbt.tax_id = 1 and tbm.isSettle = 1 and tbm.Bill_no <> 0)
    Select * from taxdet
    Select * into DPKTAX from (Select * from Taxdet)
    set nocount on

    this is great but,
    this is not dynamic at all, because at the top of the query you are trying to find the pivot columns and writing these columns hardcoded, so this is not exactly dynamic but half dynamic :)) but however this is a great post, thanks.

    We Can eliminate Null values in the pivoted list as below

    Vu Hong Trieu says:

    Thanks tut,
    So i try this but my solution useful
    1. My data here:

    Execute the Dynamic Pivot Query

    4. So i want to my result that:

    Owais Ahmed Memon says:

    i have one table contain two columns one is Type_of_case and other is Stage Both

    i want to prepare lookup table

    i tried your example and it’s working OK, but i am unable to make it work for my table
    here’s the list of all columns, for instance

    SELECT TOP 1000 [Case_id]
    FROM [WM_Cases].[dbo].[Cases_main]

    i need this result
    Type_of_case Total Of Case_id Case Closed Closure Finding Received
    Admin 6 1 5
    Age 70 1 65 4
    Decipline 54 2 35 8

    and here is the result (I am shortening field lables)
    Patient ID Document Number Claim N CARE U007
    1000 289 1 76.58 19.53

    However I need the result be like:

    Patient ID Document Number Claim N Insu1 Insu2 Paid1 Paid2
    01000 289 1 CARE U007 76.58 19.53

    My query is working successfully, but some of the columns are not showing up. For instance, in the Name column, I have the values Object, Use, and Size that I am trying to use as column headers, but only Use and Size are appearing as headers. I can see Object as a value in the table I’m pivoting from, so I don’t know why it’s hidden. Can anyone help?



    This is my personal blog site. The opinions expressed here represent my own and not those of my employer. For accuracy and official reference refer to MS Books On Line and/or MSDN/TechNet. My employer does not endorse any tools, applications, books, or concepts mentioned on the blog. I have documented my personal experience on this blog.






    SqlHints.com © 2016. All Rights Reserved.





    Free Windows Server 2008 Trial for 240 Days


    A free Windows Server 2008 trial is offered by Microsoft to promote the popularity of Windows Server 2008 in the server operating system market. Microsoft Windows Server 2008 is the latest generation of the Windows Server series and the successor to Windows Server 2003.

    Windows Server 2008 is built on the same code base as Windows Vista; therefore, it shares much of the same architecture and functionality, while delivering valuable new functionality and powerful improvements to the base operating system to provide a solid foundation for information technology (IT) infrastructure.

    A look at new Windows Server 2008 functionality:

    Windows Server 2008 is a feature-rich upgrade with numerous functional advantages over its predecessor, Windows Server 2003. Here are some of the changes in this release that I feel will have the biggest customer impact.

    • Server Manager, the expanded Microsoft Management Console (MMC), provides a one-stop interface for server configuration and monitoring with wizards to streamline common server management tasks.
    • Windows PowerShell, a new optional command-line shell and scripting language, enables administrators to automate routine system administration tasks across multiple servers.
    • Windows Reliability and Performance Monitor provides powerful diagnostic tools to give you ongoing visibility into your server environment, both physical and virtual, to pinpoint and resolve issues quickly.
    • Optimized server administration and data replication for increased control over servers located in remote locations, such as a branch office.
    • Componentized Server Core installation option allows minimal installations where only the server roles and features you need are installed, reducing maintenance needs and decreasing the available attack surface of the server.
    • Windows Deployment Services (WDS) provides a simplified, highly secure means of rapidly deploying Windows operating systems to computers by using network-based installation.
    • Failover clustering wizards make it easy for even IT generalists to implement high-availability solutions, Internet Protocol version 6 (IPv6) is now fully integrated, and geographically dispersed cluster nodes no longer need to be on the same IP subnet or configured with complicated Virtual Local Area Networks (VLANs).
    • Network Load Balancing (NLB) now supports IPv6 and includes multiple dedicated IP address support which allows multiple applications to be hosted on the same NLB cluster.
    • Windows Server Backup incorporates faster backup technology and simplifies data or operating system restoration.

    Microsoft provides the free Windows Server 2008 trial software for download in various editions, such as Standard, Datacenter, Enterprise, and Web Server, for users and developers to evaluate. The free Windows Server 2008 trial version initially runs for 60 days (extendable to 240 days), without the need for product activation or entering a product key. Although it’s labeled trial software, the installed copy of Windows Server 2008 can be activated with a valid and genuine product key and used as a full version product with no expiry date or grace period for activation.

    Free Download links for Windows Server 2008 Edition:

    Do note that the download is big, normally between 1 and 5 GB.



    Capacity Planning, IT Capacity Planning Reports, Infrastructure Capacity Planning and Trending


    Capacity Planning and Reporting

    With ManageEngine Applications Manager’s capacity planning and reporting ability you can monitor, measure and report your enterprise’s capacity, ensuring it’s always efficiently allocated.

    ManageEngine Applications Manager supports over 100 performance metrics that can aid capacity planning for a heterogeneous set of applications and IT resources, including servers, databases, applications, web servers and web services.

    Accurate capacity analysis of resources

    Get complete capacity management for both physical and virtual servers with instant graphs on critical performance metrics such as

    • CPU utilization
    • Memory utilization
    • Disk I/O utilization
    • Network utilization

    Measure overall resource usage as well as usage by workload.

    Troubleshoot capacity problems and allocate capacity to where it’s needed.

    With configurable threshold settings you can identify servers and virtual machines that can be right-sized so that their workloads get sufficient capacity.

    Discover capacity bottlenecks even in dynamic VMware environments.

    Get proactive alerts and locate the source of the capacity problem by drilling into the performance metrics.

    Capacity reports filtered over various customized time periods.

    Measure overall resource usage with detailed reports of excess or insufficient capacity from undersized, oversized and idle servers/virtual machines.

    Reports can be exported in PDF, Excel, CSV and e-mail formats.

    Unix and Solaris Capacity Planning

    The server monitoring and capacity planning abilities help plan virtualization efforts with the capability to monitor CPU, memory, disk, network traffic, etc. It also helps you understand how resource usage varies across time.

    Oracle Capacity Planning

    ManageEngine Applications Manager provides the ability to monitor the performance of various Oracle applications. You can plan capacity for Oracle database, Oracle Application Servers (Oracle AS), Oracle WebLogic, Oracle E-Business Suite, etc. In addition to capacity planning, there is support for Oracle database monitoring, Oracle E-Business Suite performance monitoring, and Oracle application server monitoring.

    IBM Capacity Planning

    ManageEngine Applications Manager provides the ability to monitor the performance of IBM applications. We support performance management and capacity planning for IBM WebSphere, WebSphere MQ, IBM DB2 and AS/400 (IBM iSeries).

    Infrastructure Capacity Planning



    Learn cloud computing technologies and test your skills with Cloud Academy


    Cloud Training Has Never Been Easier

    Automate the onboarding process for new team members, easily manage user privileges, and track progress of each member’s skills and your organization’s overall cloud training goals.

    Team Progress Monitoring

    Monitor your organization’s training progress with key performance metrics for understanding how much time teams and individual members are investing in cloud training, track the quality of their responses in quizzes, and more.

    Cloud Academy for Teams includes gamification features to engage and motivate your teams to keep their cloud skills up to date.

    Learn More

    Case Studies

    At Cloud Academy, we’ve helped secure an ever-increasing number of successful enterprise cloud migration and training projects around the world. Our vendor neutral, continuous learning platform is designed to help IT professionals and managers quickly learn new cloud computing skills, acquire cloud certifications, and stay up to date with the latest technologies.

    Explore how other global enterprises and startups have used our platform

    Cloud Academy Learning Paths direct our employees through a focus area, allowing them to begin a long journey pointing in the optimal direction.

    It’s far more cost effective and less disruptive than sending someone to a week (or weeks) of training. People can learn just what they need, when they need it, at a time and in a place that is convenient for them.

    We achieved outstanding results with Cloud Academy. The team really enjoys working with the platform, the gamification of the e-learning process works great to motivate them and grow.

    Meet some of our customers

    Founder, CEO at CloudCheckr

    The Cloud Academy platform was great. Quizzes and prep courses gave me a thorough understanding of the material on the certification test.

    Cloud Consulting Manager

    The others have courses, but it’s not a complete package. You guys definitely have more!

    IT Systems Architect at SpaceX

    Cloud Academy has a lot of content covering a diverse set of topics. I enjoyed that a good number of the non-basic services are covered.






    Log Analysis in Hadoop

    Table of Contents

    Log Files:

    Logs are computer-generated files that capture network and server operations data. They are useful during various stages of software development, mainly for debugging and profiling purposes, and also for managing network operations.

    Need For Log Files:

    Log files are commonly used at customer installations for the purpose of permanent software monitoring and/or fine-tuning. Logs are essential in operating systems, computer networks, distributed systems and storage filers. Uses of log file analysis are:

    • Application/hardware debugging and profiling
    • Error or access statistics are useful for fine-tuning the application/hardware functionality. For example, based on the frequency of an error message in the past 6 months, we can forecast its occurrence in the future; if we can provide a fix for the error before it occurs on the customer’s application/hardware, customer satisfaction will improve, which in turn increases business.
    • Security monitoring of application/hardware. For example, if we suspect a security breach, we can use server log data to identify and repair the vulnerability.

    Log File Types:

    These log files can be generated from two types of servers: web servers and application servers.

    Web Server Logs:

    Web servers typically provide at least two types of log files: an access log and an error log.

    • Access log records all requests that were made of this server including the client IP address, URL, response code, response size, etc.
    • Error log records all requests that failed and the reason for the failure as recorded by the application.

    For example, consider logs generated by a web server such as Apache; logs of this site, hadooptutorial.info, are provided in the following sections of this post.

    Application Server Logs:

    These are the logs generated by application servers, and the custom logs they produce can provide a great level of detail for application developers and analysts to understand how the application is used. Since developers can log arbitrary information, application server logs can be even larger than web server logs.

    For example, logs generated by Hadoop daemons can be treated as Application logs.

    Challenges in Processing Log Files:

    • As the log files are being continuously produced in various tiers with different types of information, the main challenge is to store and process this much data in an efficient manner to produce rich insights into the application and customer behavior. For example, a moderate web server will generate logs at least gigabytes in size over a month.
    • We cannot store this much data in a relational database system. RDBMS systems can be very expensive, and cheaper alternatives like MySQL cannot scale to the volume of data that is continuously being added.
    • A better solution is to store all the log files in HDFS, which stores data on commodity hardware, so it will be cost effective to store huge volumes (TBs or PBs) of log files, and Hadoop provides the MapReduce framework for parallel processing of these files.

    The Hadoop ecosystem sub-components Pig and Hive support various UDFs that can be used to parse these unstructured log files and store them in structured format.

    Log File Processing Architecture:

    As Hadoop supports processing of structured, semi-structured and unstructured data efficiently, log files are a good real-world example of unstructured data, and processing them through Hadoop is a strong use case for Hadoop in action.

    Below is the high level architecture of Log analysis in hadoop and producing useful visualizations out of it.

    As shown in the above architecture below are the major roles in Log Analysis in Hadoop.

    Flume: collects streaming log data into HDFS from various HTTP sources and application servers.

    HDFS: the storage file system for the huge volumes of log data collected by Flume.

    Pig: parses these log files into structured format through various UDFs.

    Hive: Hive or HCatalog defines a schema for this structured data, and the schema is stored in the Hive metastore.

    Hunk: a search, processing, and visualization tool that connects to the Hive server and metastore and pulls in the structured data; on top of it we can build various types of visualization charts. For more details on this connectivity to Hive and the visualizations on top of it, refer to the post Hunk Hive connectivity.

    Tableau: a visualization tool that provides connectivity to the Hive server. For more details, refer to the post Tableau connectivity with Hive.
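    As a sketch of the Hive step, a schema over parsed logs might be declared like this (the HDFS path and column names are illustrative, not from the original tutorial):

```sql
-- HiveQL: external table over tab-delimited, parsed web-server logs in HDFS.
CREATE EXTERNAL TABLE access_logs (
    client_ip   STRING,
    request_ts  STRING,
    url         STRING,
    status_code INT,
    bytes_sent  BIGINT
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
LOCATION '/user/flume/logs/parsed/';
```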

    Sample Use Cases for Log File Analysis:
    • Ranked list of the most significant trends and page views over the last day/week/month
    • Identify the characteristics of the most active users to help promote this behavior across the board
    • Co-relate user sessions with transactional data to better understand how to improve sales

    In the next post, we will discuss loading and parsing web server logs and custom application logs using Pig, making them structured and ready for defining a schema in Hive/HCatalog.




    Posted In: NEWS


    Apache Tomcat monitoring and management plugin


    Apache Tomcat monitoring plugin

    Verax NMS Apache Tomcat management plugin enables easy monitoring, alerting, health checks, management and performance reporting for Apache Tomcat servers (server version 4.x with Java 1.5 or higher is supported). JMX is used as the communications protocol.


    General information view

    The view provides general configuration information for a Tomcat instance, such as:

    • Operating system platform and version.
    • JVM information (version, vendor, 32- or 64-bit).
    • Average response times and control ports.
    • Summary of busy and current threads per each Tomcat connector.


    Applications view

    The applications view provides detailed information about installed, running and suspended applications. A usage summary, including current sessions, peak sessions, servlet performance, application status, invalid requests and other statistics, is available for each running application.

    Connectors view

    The view lists all configured connectors, including information such as TCP port, protocol used (e.g. HTTP), security status (secure or non-secure connection), redirect port and maximum size allowed for POST operations. The view also describes currently configured thread pools, including the number of threads that are currently busy, total threads created by the connector and maximum spare threads.

    Request processors view

    This view provides detailed information about request processors configured in the Apache Tomcat application server including: URI, worker, total requests serviced, number of failed requests, bytes sent and received, maximum and total processing time and others.

    JVM view

    This view presents statistics on various JVM aspects including:

    • Runtime:
      • Process name and PID for the JVM.
      • JVM identification string (e.g. Java HotSpot Server 19 VM).
      • Start time and startup options: command line arguments, boot class paths and library paths.
    • Memory and memory pools:
      • Heap and non-heap memory usage.
      • Memory consumed by individual memory pools.
      • Detailed list of garbage collectors with name, number of collected objects, collection time and collector status (valid or invalid).
    • Threads:
      • Thread statistics: current number, peak number, total number of threads started since server startup, number of classes loaded/unloaded.
      • Detailed list of all threads with their name, state (runnable, blocked, waiting, etc.), block and wait counts, CPU time and deadlock status.

    Predefined Apache Tomcat monitoring templates

    The plugin provides predefined templates for most commonly monitored Apache Tomcat items (listed in the table below). Other, user-defined sensors and performance counters can be added.

    Apache Tomcat monitoring templates



    Database Health Monitor


    Does Your SQL Server Have Performance Issues?

    Find out with the database health monitor

    Database Health Monitor is a powerful performance monitoring and diagnostics solution that shows administrators server health, performance or availability problems within their SQL Server environment, all from a central console.

    Helping Administrators Perform Monitoring Faster

    From historic wait stats monitoring to instance-level reports that give a quick overview of your SQL Server, Database Health Monitor helps you monitor and quickly understand the health of your databases. The performance monitoring is built for people with one or hundreds of SQL Servers who need to quickly check status, find problems, and remedy those problems.

    Save time on database tuning with the Database Health Reports

    The Database Health Monitor is a free tool for you to use to monitor your SQL Server environments. There is no limit to the number of servers you can monitor. Download Now

    What's in the tool?

    Database Health Monitor is a tool built by Steve Stedman of SQL Data Partners to help SQL Server administrators find performance issues or bottlenecks on SQL Server. The tool started in 2010 as a way to help gather metrics for our own environments; we have added to it since and ensure it continues to work with the latest versions of SQL Server. The tool is now available for free.

    We know you will benefit from using this tool to monitor your environments. If it works for you and helps, great! If you would be so kind as to tell others about it, you have our thanks.

    Download Database Health Monitor and do all your own performance monitoring.
    Download for free now…

    If you need help, we offer a performance evaluation to review the results of the Database Health Monitor. The review process can be as simple as a few hours, or we can dig in and provide database tuning for your system. If you think you want to work with us, simply schedule a short meeting, normally 30 minutes, to review what you have in mind.

    Database Health Version 2.5.4



    RAID Enclosures



    RAID Enclosure / Subsystems

    Mediasonic HFR7-SU3S2 PRORAID Box 4 Bay Raid 3.5 SATA Hard Drive Enclosure with USB 3.0 eSATA

    • Drive: SATA
    • OS Supported: Windows XP / VISTA 32 / 64-Bit / Windows 7 / Windows 8, 8.1 / 10 (with MBR enabled, supports total capacity up to 2TB) Windows Vista / 7 / 8 / 8.1 / 10 32 / 64-Bit (with GPT enabled, supports total capacity more than 2TB) Mac OS 10.3 or later
    • Power Supply: 100V to 240V
    • Dimensions: 6.54″ x 4.96″ x 8.46″
    • Model #: HFR7-SU3S2
    • Item #: N82E16816322029
    • Return Policy: Standard Return Policy

    CineRAID CR-H458 Hardware RAID 0, 1, 10, 3 and RAID 5, Support 4 x 3.5 Hot Swappable Drive Bays USB 3.0 eSATA RAID SubSystem

    • Drive: SATA
    • OS Supported: Windows 7 / Windows Vista / XP Home Edition SP-1, Professional SP-1 / Windows 2000 Professional SP-3 / Windows ME / Windows 98SE or later versions Mac OS X 10.3 or later (recommended 10.3.9 or later) Linux 32-bit / 64-bit
    • Package Contents: 1 x H458 Enclosure, 1 x eSATA Cable, 1 x USB 3.0 Cable, 4 x HDD Screw-less Trays, 1 x Power Adapter, 1 x Power Cord, 1 x User Manual and Software CD
    • Dimensions: 6.50″ x 6.50″ x 9.00″
    • Model #: CR-H458
    • Item #: N82E16816856039
    • Return Policy: Standard Return Policy

    Mediasonic HUR2-SU3 2 3.5 USB 3.0 UASP ProBox dualbay docking station 2.5 3.5 HDD clone function

    • Drive: SATA
    • OS Supported: Windows 7 / 8.1 (with MBR enabled, supports total capacity up to 2TB) Windows 7 / 8 (32 / 64 bits) (with GPT enabled, support total capacity more than 2TB) Mac OSX 10.8 or later
    • Power Supply: Input: 100

    HighPoint RocketStor 6418AS – 8-Bay 6 Gb s SAS SATA Hardware RAID Tower Enclosure

    • Drive: SAS & SATA
    • OS Supported: Windows: Windows 2008 and Windows 7 and later Linux: RedHat Enterprise, Open SuSE, Fedora Core, Debian, Ubuntu / Linux Driver embedded into Kernel 3.9.4 and later FreeBSD: Driver embedded into FreeBSD 9.0 and later Mac OS X: Mac OS X 10.6 and later (Driver embedded into Mac OS X 10.9 and later)
    • Temperature: Operating: 5 degree C – 45 degree C Non-Operating: -40 degree C – 65 degree C
    • Humidity: Operating 8% – 90% RH (Non-condensing) Non-operating 5% – 95% RH (Non-condensing)
    • Model #: RocketStor 6418AS
    • Item #: N82E16816115189
    • Return Policy: Standard Return Policy

    Mediasonic ProRaid HUR6-SU31 RAID 0, 1, JBOD USB 3.1 Gen-II Type-C ProRaid 2 Bay 2.5 SATA HDD SSD Enclosure – USB 3.1 Gen-II Type-C

    • Drive: SATA
    • OS Supported: Windows 7 / 8.1 / 10 (32 / 64-bit) (MBR Support up to 2TB, and GPT support more than 2TB) Mac OS 10.8 or later The product is Plug and Play; therefore, no driver is required for Windows 10 / 8.1 / 7 or Mac OS.
    • Power Supply: AC Power Adapter: (Optional) No Power Adapter is required if PC / Mac can supply enough power to the unit. If PC or Mac cannot supply enough power to the unit, power adapter is required to plug into the unit
    • Dimensions: 6.69″ x 13.78″ x 9.84″
    • Model #: HUR6-SU31
    • Item #: N82E16816322031
    • Return Policy: Standard Return Policy

    Sans Digital 4 Bay eSATA Port Multiplier JBOD Tower Storage Enclosure no eSATA Card bundle TR4M BNC

    4TB HDD Supported, up to 16TB, eSATA Port Multiplier



    Monitor Network Speed and Uptime with a Server Uptime Monitor


    Server Uptime Monitor

    sMonitor is a server uptime monitor for your business. Since many businesses these days offer application-based services to their customers, it is extremely important that network administrators know that all of their essential servers are up and running. sMonitor will inform you of this. sMonitor runs as a standard Windows application or as a Windows service. Because it can run as a Windows service, it can run in the background even if no one is logged into the computer. It monitors all network servers and informs network administrators by SMTP email or GSM SMS messaging if a server comes online, goes offline or has its running state modified in any other way. This way, network administrators will always know the online status of their servers with minimal delay.

    The main sMonitor window, whether displayed or not, is mirrored by a constantly updated HTML file which keeps a log of all monitored activity. This file is fully customizable from within the program and may be uploaded via FTP to any remote Web server. A built-in script interpreter interacts with remote systems using either a modem or the telnet service to launch third-party software.

    Additional features of sMonitor include its support for ICMP, UDP and TCP protocols. It can also send protocol-specific UDP requests to ports 67 (DHCP), 123 (NTP), 137 (NETBIOS-NS), 161 (SNMP) and 389 (LDAP). sMonitor can also provide alerts of a server’s uptime status by way of alarms, both audible and visible. Network administrators can receive customized email messages from the software. sMonitor also creates log files in the plain text format and the universally supported CSV (Comma Separated Values) format. Files are sorted by servers and services. More advanced users may also create custom scripts and debug them as required. The server uptime monitor provides a wealth of unique features making it one of the top-end solutions of its kind. It has many more features and a wide selection of different notification features when compared to similar programs.

    The standard method of checking a server's uptime using the Windows operating system is simply to use the ping command. However, this is extremely limited and offers no logging or automated functionality, making it useful only for very specific purposes. When you need to monitor the uptime of one or more servers constantly, sMonitor presents an essential solution that every network administrator will find use for. Another standard option is the PortQry utility available from Microsoft. It offers more sophisticated functionality than the ping command, but it is still limited, and there is no user-friendly GUI since it is simply a command-line utility. Neither of these tools provides any functionality to automatically alert the user about the uptime status of a server.
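    The basic reachability test that these tools and sMonitor perform can be sketched in a few lines of Python (a simplified stand-in for illustration, not sMonitor's implementation; the host/port list is an example):

```python
import socket

def check_tcp(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical server list; a real monitor would loop, log, and alert.
services = [("example.com", 80), ("example.com", 443)]
for host, port in services:
    state = "UP" if check_tcp(host, port) else "DOWN"
    print(f"{host}:{port} {state}")
```

    A real monitor adds the pieces described above on top of this primitive: scheduling, logging to text/CSV, and email or SMS alerting on state changes.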

    You can learn more about this server uptime monitor at http://www.yarovy.com/smonitor. A list of features is available here along with screenshots of the working software.

    • Runs as a standard application or a Windows service.
    • Runs as a system tray application automatically on startup.
    • Supports ICMP, TCP and UDP.
    • Sends protocol-specific UDP requests to the ports: 67 (DHCP), 123 (NTP), 137 (NETBIOS-NS), 161 (SNMP), 389 (LDAP).
    • Allows server lists to be saved and opened as user specified files.
    • Alerts by audible and visible alarms.
    • Allows SMTP (SSL) mail and SMS notifications when a service is down, up, or its state is changed.
    • Allows custom e-mail and SMS message format to be created.
    • Launches third-party applications.
    • Creates an HTML status file which mirrors the main sMonitor window.
    • Allows the HTML file to be customized and to be uploaded by FTP to a remote web server.
    • Contains a built-in script interpreter which interacts with remote systems by modem and telnet.
    • Allows custom scripts to be created and debugged.
    • Generates plain text log files and CSV format files sorted by servers and services.
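    As a concrete example of one of those protocol-specific probes, an NTP (port 123) request is just a 48-byte packet whose first byte packs the leap-indicator, version, and mode fields. A minimal Python sketch (packet construction only; sending the UDP datagram is omitted):

```python
import struct

def ntp_request_packet():
    """Build a minimal 48-byte SNTP client request:
    LI = 0, version = 3, mode = 3 (client) packed into the first byte."""
    first_byte = (0 << 6) | (3 << 3) | 3  # == 0x1b
    return struct.pack("!B", first_byte) + b"\x00" * 47

packet = ntp_request_packet()
print(len(packet), hex(packet[0]))
```

    A monitor sends this datagram to port 123 and treats any well-formed reply as "service up".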

    Are you in need of a tool that can constantly verify your network connectivity? Try sMonitor. It is a dependable application that periodically checks your TCP and UDP ports. This smart piece of software can be incredibly useful so why not try it right now?



    Application Monitoring With Applications Manager, Software for Applications Monitoring


    Application Monitoring

    Applications Manager provides in-depth monitoring of web applications, be it a CRM application, a banking/finance application or any business-critical application. Applications Manager can also help monitor the underlying infrastructure, which may consist of application servers, databases, systems, mail servers and other Java/J2EE applications.

    Applications Manager helps ensure higher uptime by detecting and diagnosing problems of application servers and their services faster.

    The Applications Manager tool has assisted our small team of solution providers to monitor our applications independently of the multitude of other services offered within our corporate network. Setting up our monitors has been a relatively easy process, and where we got stuck we have benefited from the support provided by the local partner in South Africa (ONSoft) and your ManageEngine support team. The additional functionality offered by SLA manager will assist us to validate system availability to our internal customers.

    Tim Jobson
    Shared Solutions Support Team
    Metropolitan Life – South Africa

    Applications Manager enables high performance business service management by detecting and diagnosing problems of application servers and their services faster. Application Server monitoring includes support for the following application servers:

    JBoss is one of the most popular fully compliant open source J2EE application servers. Applications Manager helps you monitor performance, availability, and usage statistics for JBoss servers. JBoss servers are monitored based on attributes such as JVM heap usage, response time, etc. and the different web applications and EJBs deployed in the server.

    Tomcat, a leading servlet engine from the Apache Software Foundation, is one of the most popular open source projects. Applications Manager helps you monitor performance, availability, and usage statistics for Tomcat servers.

    Gather details of the Tomcat Server parameters such as JVM, Web Applications details, EJBs, Servlets, Thread Pools, database connection pools, etc.

    WebLogic Versions: 6.x, 7.x, 8.x, 9.x

    Applications Manager helps in managing BEA WebLogic Servers.

    Gather insight into WebLogic server parameters such as JVM, Web Applications details, EJBs, servlets, thread pools, database connection pools, etc.

    Applications Manager helps in managing IBM WebSphere servers.

    Gather performance data of WebSphere server parameters such as JVM, Web Applications details, EJBs, Servlets, Thread Pools, database connection pools, etc.

    Applications Manager helps in monitoring SilverStream Servers.

    Gather insight into SilverStream server parameters such as thread pools, Response times, Memory, Load details, etc.

    Applications Manager helps in managing GlassFish Application Servers.

    Gather performance data of GlassFish Server parameters such as Session details, Memory, Thread details, etc.

    Applications Manager also monitors the performance and availability of Microsoft .NET.

    Gather insight into Microsoft .NET parameters such as memory, thread pools, locks, exceptions, connections, etc.

    Applications Manager also monitors the performance and availability of Oracle Application Server.

    Gather insight into Oracle Application Server parameters such as OPMN Process Memory Stats, EJB Stats etc.

    By monitoring your application servers, ensure that your applications are performing well and are available all the time. This includes the monitoring of EJBs, JSPs, Servlets, JTA, JNDI, JDBC, etc. Current and historical performance data can be viewed through comprehensive graphs and reports. Root cause analysis helps in drilling down to the problem areas and fixing them before they affect end users.

    Trusted by



    OneProvider – Dedicated servers in Toronto, Canada


    value=”http://oneprovider.com/”>Home value=”#”>Why? value=”/hosting-provider/why-one-provider”> Why OneProvider? value=”/hosting-provider/onepanel”> OnePanel™ value=”/dedicated-servers-locations”>Dedicated servers value=”/dedicated-servers-locations”> All Server Locations value=”/dedicated-servers-in-north-america”> North America value=”/dedicated-servers-in-europe”> Europe value=”/dedicated-servers-in-asia”> Asia value=”/dedicated-servers-in-south-and-central-america”> South and Central America value=”/dedicated-servers-in-oceania”> Oceania value=”/dedicated-servers-in-africa”> Africa value=”/dedicated-servers/dedicated-server-promotions”>Promotions! value=”/dedicated-servers/dedicated-server-promotions”> Current Promotions value=”/dedicated-servers/clearance-deals”> Clearance Deals value=”/onecloud”>OneCloud™ value=”/onecloud/ssd-virtual-servers”> SSD Virtual Servers value=”/onecloud/students”> OneCloud for Students value=”#”>Complex solutions value=”/complex-dedicated-hosting-solutions/complex-solutions”> Complex Solutions value=”/complex-dedicated-hosting-solutions/colocation”> Colocation value=”#”>Support value=”/support/support-center”> Support Center value=”/support/vip-support”> VIP Support value=”/support/frequently-asked-questions”> F.A.Q. value=”/about-us”>About us value=”/about-us/terms-of-service”> Terms of Service value=”/about-us/acceptable-usage-policy”> Acceptable Usage Policy value=”/about-us/service-level-agreement”> SLA value=”/about-us/privacy-policy”> Privacy Policy

    Can’t find what you’re looking for? Try one of these nearby locations!

    • Montreal, QC
      Starting at $25
    • McLean, VA
      Starting at $229
    • Washington, DC
      Starting at $39
    • Springfield, VA
      Starting at $249



    Managing Distribution Lists


    Managing Distribution Lists in Hybrid Exchange Online/Office 365 Environments

    How to give Exchange Online users the ability to manage their distribution lists

    Microsoft has done a great job of ensuring that hybrid Exchange Online/Office 365 tenants have almost all of the features and functionality of on-premises Exchange Server deployments, without the need for running and maintaining their own servers. However, there is still one major gap companies consistently run into as they move to a hybrid Exchange Online/Office 365 environment: distribution list (DL) management.

    In hybrid Exchange Online/Office 365 environments, DLs are created on premises and synchronized with Azure Active Directory (Azure AD) through DirSync so that users with mailboxes in Exchange Online can use those DLs. However, Exchange Online users can't manage their own DLs because DirSync doesn't support the write-back of DLs from Office 365 to the on-premises Active Directory (AD) forest. Exchange has supported the delegation of DL management to users for several versions now, so it comes as a bit of a shock to IT departments when they find out that they have to manage DLs again.

    There are several solutions for this problem, which fall into two categories:

    • Move DLs to Azure AD so that Exchange Online users can manage them through Outlook. This can be accomplished through DL migration.
    • Find a way for Exchange Online users to manage on-premises DLs. By using Forefront Identity Manager, DSQuery, or a Windows PowerShell script, you can provide a way for users to do so.

    Although none of these solutions will get DL management to work the way it used to, I hope one of the solutions will fit your needs.

    DL Migration

    To move DLs to Azure AD, you can migrate your DLs from your on-premises AD forest to Azure AD. As I mentioned previously, users with mailboxes in Office 365 can't modify DLs that are synchronized with DirSync because DirSync doesn't support the write-back of DLs from Office 365 to the on-premises AD forest. This solution solves that problem by creating new DLs in Azure AD that users with mailboxes in Office 365 will be able to modify.

    If your organization is going to stay in a hybrid configuration, with some mailboxes in Office 365 and some mailboxes in the on-premises AD forest, there's a downside to this solution: The on-premises users won't be able to use and modify the migrated DLs. In addition, if you ever decide to move back to an on-premises Exchange environment from Office 365, there's no easy way to migrate those DLs back to your on-premises AD forest.

    If you're moving all user mailboxes into Office 365 and you have a limited number of DLs to move, this might be a workable solution. If the plan is to maintain mailboxes on premises or if you have a large number of DLs, this solution turns into more trouble than it's worth.

    Forefront Identity Manager

    You can use Forefront Identity Manager to provide Exchange Online users the ability to manage their on-premises DLs. On the upside, Forefront Identity Manager gives users a nice graphical interface for managing DLs that's similar to the interface users have become accustomed to in Outlook. On the downside, Forefront Identity Manager is expensive to purchase. Depending on how you buy it, the licensing is going to be thousands of dollars. You also need to consider the cost of the server hardware and IT staff to configure and maintain the solution. As a result, few (if any) organizations are going to find user DL management important enough to justify a new deployment of Forefront Identity Manager. If your organization already has Forefront Identity Manager deployed, this is a nice added use for the solution.


    DSQuery

    DSQuery, a command-line tool that has been around for some time, can provide Exchange Online users the ability to manage on-premises DLs. However, most users won't find it easy to manage DLs with this tool. Here are a couple of commands that illustrate this point:
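    For illustration, sketches of such commands (the DL name and LDAP filter here are examples; verify them against your own directory before use):

```shell
:: Show the members of the DL named "Sales - East"
dsquery group -name "Sales - East" | dsget group -members

:: List all mail-enabled groups (DLs) in the AD forest
dsquery * forestroot -filter "(&(objectCategory=group)(mail=*))" -attr cn mail
```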

    The first command shows the members of a DL named Sales – East. The second command shows all the mail-enabled DLs in an AD forest. Most Exchange Online users are going to be very unhappy if you tell them they need to run commands like this to manage their DLs.

    PowerShell Script

    The final option is to use a PowerShell script to allow Exchange Online users to manage their on-premises DLs. You can find a script I have written, Manage-DistributionLists, in the TechNet Gallery.

    The Manage-DistributionLists script gives Exchange Online users an interface to manage on-premises DLs without having to deal with the complexities of DSQuery. In my opinion, this is the best option for DL management. It keeps all your organization's DLs homed in your on-premises AD forest, but still allows your Exchange Online users to manage those DLs themselves. This script requires an on-premises Exchange server, but if you're using DirSync with your Exchange Office 365 tenant, it's highly recommended that you maintain a hybrid Exchange server on premises.

    Here's a quick walkthrough of how this script works. When you launch the script, it will ask for an Exchange server name, as Figure 1 shows.

    If you want to make life a little easier for your Exchange Online users, you can modify the script. First, find the following lines at the top of the script:

    Next, remove the first two lines. Finally, replace $ExchangeServer with the name of your hybrid server in the last line. With this modification, your users won't be prompted to enter an Exchange server name. Instead, they'll be automatically connected to it.

    After the connection is completed, users will be presented with the menu shown in Figure 2.

    As you can see, I tried to make the script as self-explanatory as possible. All the users need to do is select the option they want. For example, if they select the first option, they'll receive results that look like those in Figure 3.

    If you have any suggestions on how to improve the script, please let me know.

    Don't Overlook DL Management

    There are many things to consider when planning a migration to Office 365, and DL management is one facet that's fairly easy to overlook. I hope that I've been able to clarify your options for DL management and make your migration a little bit easier.



    Microsoft Exchange Server Analyzer


    This documentation is archived and is not being maintained.

    Microsoft Exchange Server Analyzer – Articles

    [This topic is intended to address a specific issue called out by the Exchange Server Analyzer Tool. You should apply it only to systems that have had the Exchange Server Analyzer Tool run against them and are experiencing that specific issue. The Exchange Server Analyzer Tool, available as a free download, remotely collects configuration data from each server in the topology and automatically analyzes the data. The resulting report details important configuration issues, potential problems, and nondefault product settings. By following these recommendations, you can achieve better performance, scalability, reliability, and uptime. For more information about the tool or to download the latest versions, see “Microsoft Exchange Analyzers” at http://go.microsoft.com/fwlink/?linkid=34707 .]

    The articles in this section correspond to the rules and resulting messages that are generated by the following Microsoft® Exchange Analyzers:

    Exchange Best Practices Analyzer

    Exchange Troubleshooting Assistant

    The Exchange Troubleshooting Assistant provides access to the following functionalities:

    Exchange Performance Troubleshooter

    Exchange Database Troubleshooter

    Exchange Database Recovery Management

    Exchange Mail Flow Troubleshooter

    Each article describes a single error, warning, or non-default configuration message that could be detected by the Exchange Server Analyzer.

    The Microsoft Exchange Analyzers Web page at http://go.microsoft.com/fwlink/?linkid=34707 contains downloads for the current versions of Exchange Best Practice Analyzer and Exchange Troubleshooting Assistant. In addition, the page contains information about system requirements, and a link to a newsgroup for discussion about the analyzers.

    If you are running in an environment where you cannot use the automatic updating facility for Exchange Best Practices, you can access the Microsoft Exchange Best Practices Analyzer Web Update Pack Web page at http://go.microsoft.com/fwlink/?linkid=34290. Here you will be able to download the most recent configuration and Help files.

    If updates are being applied automatically, you do not have to use this download.



    Build PXE Boot Server for Windows with CCBoot


    PXE Boot Server Brief

    PXE, short for Preboot eXecution Environment (or Pre-Execution Environment), was introduced by Intel as part of the Wired for Management framework. In a nutshell, a PXE boot server is a combination of a DHCP server and a TFTP server. It responds to requests from diskless stations over the network, allocates IP addresses to them via DHCP, and pushes the necessary data to these stations so that they can boot over the LAN even without a hard disk. CCBoot supports installing Windows 7 via PXE boot.

    With a PXE boot server, a diskless computer's boot process is as below: power on – BIOS – PXE stack built into the NIC (Network Interface Card) – NBP (Network Boot Program) downloaded from the server to the client's RAM by TFTP – the NBP performs the next step (a.k.a. the 2nd-stage boot).
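    For a feel of the TFTP step, the read request (RRQ) the client sends for the NBP is a tiny packet: a 2-byte opcode of 1 followed by NUL-terminated filename and transfer-mode strings (per RFC 1350). A minimal Python sketch (the filename is an example):

```python
import struct

def tftp_rrq(filename, mode="octet"):
    """Build a TFTP read-request (RRQ) packet per RFC 1350:
    2-byte opcode 1, then NUL-terminated filename and transfer mode."""
    return (struct.pack("!H", 1)
            + filename.encode("ascii") + b"\x00"
            + mode.encode("ascii") + b"\x00")

print(tftp_rrq("pxelinux.0"))
```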

    Benefits of a PXE boot server

    • 1, Reduce initial capital and implementation costs, reduce power and cooling requirements, reduce complexity and risk.
    • 2, Accelerate deployments, upgrades, and server repurposing.
    • 3, Implement enhanced Disaster Recovery solutions.

    As an administrator responsible for a network of dozens of computers or more, you will find a PXE boot server program very helpful. It can handle Windows XP, 2003, Vista, Windows 7 and 2008 booting, and it will drastically reduce your daily workload. If you want to install or upgrade software for all computers in the network, you just need to operate on the boot image; all computers using this image pick up the changes after a reboot. It also enables enhanced Disaster Recovery solutions, since all computers come back to a clean OS after reboot.
    In this article, we will introduce a Windows-based PXE boot server software – CCBoot.

  • Install PXE Boot Server with CCBoot and Use It for Network Booting Windows

    1. Install PXE Boot Server with CCBoot
      Download the PXE boot server software – the CCBoot server installation package – from http://www.ccboot.com/download.htm.
      Launch ccbootsetup.exe on the PXE boot server and keep pressing the Next button to the end.

    CCBoot uses the following ports – 67 (DHCP), 69 (TFTP), 3260 (iSCSI), 1000 (Image Upload), 8001 (Service Control). You need to open these ports in the firewall of the PXE boot server. Since CCBoot v2.1, you also need to open port 66, which is used as a DHCP backup.
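The ports above can be opened with Windows Firewall rules from an elevated command prompt. A sketch using netsh (rule names are arbitrary; treating the CCBoot image-upload and service-control ports as TCP is my assumption):

```shell
:: DHCP (67), the v2.1 DHCP backup port (66), and TFTP (69) are UDP.
netsh advfirewall firewall add rule name="CCBoot DHCP" dir=in action=allow protocol=UDP localport=66,67
netsh advfirewall firewall add rule name="CCBoot TFTP" dir=in action=allow protocol=UDP localport=69

:: iSCSI (3260) is TCP; image upload (1000) and service control (8001) are assumed TCP.
netsh advfirewall firewall add rule name="CCBoot iSCSI" dir=in action=allow protocol=TCP localport=3260
netsh advfirewall firewall add rule name="CCBoot Control" dir=in action=allow protocol=TCP localport=1000,8001
```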

    Note: Please shut down any other DHCP services on the LAN, especially the DHCP service in the router.

    Launch CCBoot and you will get the main interface as below:

  • Initialize the PXE Boot Server
    Demo Environment

    Server IP:
    DNS Address:
    IP Mask:
    DHCP Range:

    Launch the PXE boot server software – CCBoot, open menu Options – Options Wizard, and configure step by step as below:

    You need to select the correct local IP address as DHCP Server IP . Press Scan DHCP to check whether there are other DHCP services on the LAN; you need to stop any other DHCP services on the LAN.

    Set Server IP Address . Normally, it’s the same as DHCP Server IP .
    Set Write-back File Path and Image Save Path as you want.

    Write-back File Path is used to store the clients' write-back data. Use a large-capacity hard disk for the Write-back File Path . This disk should be formatted as NTFS with 64 KB per cluster.

    Image Save Path is used to store the boot images. This disk should also be formatted as NTFS with 64 KB per cluster. Use a fast hard disk for the Image Save Path – for example, a SAS hard disk.

    Keep default values in Server Cache Settings .

    Keep default values in Other Settings . Press the Finish button and confirm the popup dialog box.

  • Upload Image to The PXE Boot Server for Windows XP
    To network boot Windows XP with the PXE boot server software – CCBoot, we first of all need to create a system image; here are the steps:
    1. Choose one client PC as master PC used to upload image to the PXE boot server. Attach a hard disk on the PC.
    2. Delete all partitions first. Allocate a small MBR partition of about 40 GB and leave the rest unallocated. Format the 40 GB partition with NTFS. Install Windows XP and the latest service pack into this partition.
    3. After completing the Windows installation, open the local area connection network properties and configure as below:

    Select Internet Protocol (TCP/IP) and click Properties .

    Select Obtain an IP address automatically and Obtain DNS server address automatically , then click OK to save.

  • On the PXE boot server, open CCBoot main window, you will find a client in the client list (Figure 10) that was added by CCBoot automatically when the client PC got IP address from the CCBoot DHCP service.
  • Double click the client to edit it and check both Enable Upload Image and Keep Write-back File (Figure 11). When you press the Save button, it will ask "Are you sure to delete write-back file?"; just press No .
  • Download the CCBoot client installation package from: http://www.ccboot.com/download.htm. Launch ccbootsetupclient.exe and keep pressing the Next button to the end. Then launch the CCBoot client and you will see the main interface as below (Figure 12).
  • Press the Install CCBoot Client button. After it finishes, it will require a system reboot; reboot the client PC.
  • After the reboot, launch CCBootClient again and input the correct Server IP address ; it should be the IP address of the PC on which the CCBoot server is installed. Input a name of your choice in the Image File Name field. Press the Upload Image button to upload the image to the CCBoot server. CCBoot will then create an image under the server's Image Save Path .

    Note: CCBoot supports two image file formats. It supports VMDK if you are using Windows 2003 as the CCBoot server system, and both VMDK and VHD if you are using Windows 7 or Windows 2008. As you can see in Figure 12, the file format depends on what you set for Image File Name – for example, XP01.vmdk or XP01.vhd .

  • Network Boot Windows XP from The PXE Boot Server
    1. On the PXE boot server, open CCBoot main window, double click PC101 (Figure 10) to open the master PC’s properties dialog box, uncheck Enable Upload Image and Keep Write-back File .
    2. Remove the HDD from the master PC and, in the BIOS settings, set it to boot first from LAN (also labeled network, PXE ROM, or something similar) so that it will start network booting Windows XP from the PXE boot server (Figure 13).
    3. The first time booting Windows XP from the PXE boot server, you can modify its computer name (Figure 14).

      Set the computer name as you wish then press enter key to boot it (Figure 15).

    4. On CCBoot server, Options – Settings – Default Client Settings – Disk Group – press the button, select XP01.vmdk as the default boot image in System Image Selection section.
    5. Do the same as Step 2 and Step 3 for other client PCs with the same specifications as the master PC to network boot Windows XP for them.

  • PXE Boot Server Used for Windows 7 and Vista
    PXE Boot Server for Windows 7.
    PXE Boot Server for Vista.





    Windows Server Backup Step-by-Step Guide for Windows Server 2008

    The Windows Server Backup feature provides a basic backup and recovery solution for computers running the Windows Server® 2008 operating system. Windows Server Backup introduces new backup and recovery technology and replaces the previous Windows Backup (Ntbackup.exe) feature that was available with earlier versions of the Windows operating system.

    The Windows Server Backup feature in Windows Server 2008 consists of a Microsoft Management Console (MMC) snap-in and command-line tools that provide a complete solution for your day-to-day backup and recovery needs. You can use four wizards to guide you through running backups and recoveries. You can use Windows Server Backup to back up a full server (all volumes), selected volumes, or the system state. You can recover volumes, folders, files, certain applications, and the system state. And, in case of disasters like hard disk failures, you can perform a system recovery, which will restore your complete system onto the new hard disk, by using a full server backup and the Windows Recovery Environment.

    You can use Windows Server Backup to create and manage backups for the local computer or a remote computer. You can also schedule backups to run automatically and you can perform one-time backups to augment the scheduled backups.

    Windows Server Backup is available in all editions of Windows Server 2008 (both 32-bit and 64-bit versions). However, the Windows Server Backup snap-in is not available for the Server Core installation option of Windows Server 2008. To run backups for computers with a Server Core installation, you need to either use the command line or manage backups remotely from another computer. In addition, Windows PowerShell is not available for the Server Core installation option, so the cmdlets for Windows Server Backup are also not available on this type of installation.

    Windows Server Backup includes the following improvements:

    • Faster backup technology. Windows Server Backup uses Volume Shadow Copy Service (VSS) and block-level backup technology to back up and recover your operating system, files and folders, and volumes. After the first full backup is created, you can configure Windows Server Backup to automatically run incremental backups by saving only the data that has changed since the last backup. Even if you choose to always perform full backups, your backup will take less time than it did in earlier versions of Windows.
  • Simplified restoration. You can restore items by choosing a backup and then selecting specific items from that backup to restore. You can recover specific files from a folder or all the contents of a folder. In addition, previously, you needed to manually restore from multiple backups if the item was stored on an incremental backup. But this is no longer true—you can now choose the date of the backup version for the item you want to restore.
  • Simplified recovery of your operating system. Windows Server Backup works with new Windows recovery tools to make it easier for you to recover your operating system. You can recover to the same server—or if the hardware fails, you can recover to a separate server that has similar hardware and no operating system.
  • Ability to recover applications. Windows Server Backup uses VSS functionality that is built into applications like Microsoft® SQL Server® to protect application data.
  • Improved scheduling. Windows Server Backup includes a wizard that guides you through the process of creating daily backups. System volumes are automatically included in all scheduled backups so that you are protected against disasters.
  • Offsite removal of backups for disaster protection. You can save backups to multiple disks in a rotation, which enables you to move disks from an offsite location. You can add each disk as a scheduled backup location and, if the first disk is moved offsite, Windows Server Backup will automatically save backups to the next disk in the rotation.
  • Remote administration. Windows Server Backup uses an MMC snap-in to give you a familiar and consistent experience for managing your backups. After you install the snap-in, you can access this tool through Server Manager or by adding the snap-in to a new or existing MMC console. Then, you can manage backups on other servers by clicking the Action menu in the snap-in, and then clicking Connect to Another Computer .
  • Automatic disk usage management. After you configure a disk for a scheduled backup, Windows Server Backup automatically manages the disk usage—you do not need to be concerned about running out of disk space after repeated backups. Windows Server Backup will automatically reuse the space of older backups when creating new backups. The management tool displays the backups that are available and the disk usage information. This can help you plan for provisioning additional storage to meet your recovery objectives.
  • Extensive command-line support. Windows Server Backup includes the wbadmin command and documentation, which enable you to perform all of the same tasks at the command line that you can perform by using the snap-in. For more information, see the Command Reference (http://go.microsoft.com/fwlink/?LinkId=93131 ). You can also automate backup activities through scripting.
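As a minimal sketch of the command-line support described above, a one-time backup with the wbadmin command might look like this (the drive letters are illustrative):

```shell
:: One-time backup of the C: volume plus all critical volumes to disk E:,
:: without prompting for confirmation.
wbadmin start backup -backupTarget:E: -include:C: -allCritical -quiet

:: List the backup versions available for recovery.
wbadmin get versions
```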

    In addition, Windows Server 2008 contains a collection of Windows PowerShell™ commands (cmdlets) for Windows Server Backup that you can use to write scripts to perform backups. For more information, see http://go.microsoft.com/fwlink/?LinkId=93317 .

  • Support for optical media drives and removable media. You can manually back up volumes directly to optical media drives, such as DVD drives, and also to removable media. This offers a solution if you want to create backups that can easily be moved offsite on a one-time basis. This version of Windows Server Backup retains support for manual backups to shared folders and hard disks.





    T-SQL String Manipulation Tips and Techniques, Part 1

    T-SQL is a language that was mainly designed to handle data manipulation tasks. Not much effort and attention were given to other kinds of tasks, such as string manipulation. Therefore, when you do need to manipulate strings in T-SQL, it can sometimes be quite challenging, even for seemingly simple tasks. This article is the first of a two-part series in which I cover several common string manipulation needs. I'd like to thank the following people who provided input regarding this topic: Ruben Garrigos, Kevin Boles, Fabiano Amorim, Milos Radivojevic, Peter Larsson, and Davide Mauri.

    Counting Occurrences of a Substring Within a String

    This code returns the value 3, indicating that the substring 'hello' appears three times in the string 'abchellodehellofhello'.

    The steps in this technique are:

    Because the solution is in the form of a single expression, it can be applied as part of a query using the underlying table's or view's attributes as inputs.
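The original code listing did not survive extraction; a minimal sketch of the standard counting technique (the sample values are mine) is:

```sql
DECLARE @str    AS VARCHAR(1000) = 'abchellodehellofhello',
        @substr AS VARCHAR(1000) = 'hello';

-- Remove all occurrences of the substring, then divide the length
-- difference by the substring length to get the occurrence count.
-- (Note: LEN ignores trailing spaces.)
SELECT (LEN(@str) - LEN(REPLACE(@str, @substr, ''))) / LEN(@substr) AS cnt;  -- 3
```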

    Exactly N Occurrences of a Substring Within a String

    One obvious way to achieve this is to use the previous technique to count occurrences of a substring in a string, like so:
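The listing was lost in extraction; a sketch of the count-based validation, reusing the counting expression (the sample strings and the value 3 are illustrative):

```sql
DECLARE @str    AS VARCHAR(1000) = 'abchellodehellofhello',
        @substr AS VARCHAR(1000) = 'hello',
        @N      AS INT = 3;

-- Count occurrences and compare the count with @N.
SELECT CASE
         WHEN (LEN(@str) - LEN(REPLACE(@str, @substr, ''))) / LEN(@substr) = @N
           THEN 'True' ELSE 'False'
       END AS has_exactly_n;  -- True
```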

    Another way to handle the task is to use the LIKE predicate, like so:
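A sketch of the LIKE-based variant, again checking for exactly three occurrences (sample values are mine):

```sql
DECLARE @str    AS VARCHAR(1000) = 'abchellodehellofhello',
        @substr AS VARCHAR(1000) = 'hello';

-- At least three occurrences, but not four.
SELECT CASE
         WHEN @str LIKE '%' + REPLICATE(@substr + '%', 3)
          AND @str NOT LIKE '%' + REPLICATE(@substr + '%', 4)
           THEN 'True' ELSE 'False'
       END AS has_exactly_three;  -- True
```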

    With both techniques, the validation is handled by a single expression. Therefore, you can easily embed the expression in a query filter or a CHECK constraint in a table definition.

    Replacing Multiple Contiguous Spaces With a Single Space

    The classic solution relies on a token: a character, or even multiple characters, that you know cannot appear in the data. The steps are:

  • Replace in the source string each occurrence of (a space) with (token plus space).
  • Replace in the result of the previous step each occurrence of (space plus token) with (an empty string).
  • Replace in the result of the previous step each occurrence of (token plus space) with (a space).

    All this translates to the following T-SQL expression:
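The expression itself was lost in extraction; a sketch using ~ as the token (assuming ~ cannot appear in the data):

```sql
DECLARE @s AS VARCHAR(1000) = 'this   is a   string  with     lots  of   spaces';

-- 1) space -> '~ '   2) ' ~' -> ''   3) '~ ' -> ' '
SELECT REPLACE(REPLACE(REPLACE(@s, ' ', '~ '), ' ~', ''), '~ ', ' ') AS result;
-- this is a string with lots of spaces
```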

    The output of this code is: this is a string with lots of spaces.

    Replacing Overlapping Occurrences

    The next challenge was brought up by Davide Mauri; he initially found it in the Italian SQL Server forums. It involves a certain misconception that some people have regarding the way the REPLACE function works. Consider the following code. Before executing it, see if you can guess what the output will be:

    Some people intuitively think that the output should be .y.y.y.y.; however, it's actually .y.x.y.x. The reasoning behind the actual result is that REPLACE considers nonoverlapping occurrences from left to right. If we represent the string .x. with the symbol A, you can express the source string as AxAx.; then, replacing each occurrence of A with .y. gives you .y.x.y.x.

    Suppose you want to handle the replacement task while considering overlapping occurrences. One way to achieve this is to nest two calls to the REPLACE function, like so:
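The listing was lost in extraction; a sketch of the nested-REPLACE approach (the sample string .x.x.x.x. comes from the discussion above):

```sql
DECLARE @s AS VARCHAR(1000) = '.x.x.x.x.';

-- The second pass catches the occurrences the first pass skipped.
SELECT REPLACE(REPLACE(@s, '.x.', '.y.'), '.x.', '.y.') AS result;  -- .y.y.y.y.
```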

    Another option is replacing each occurrence of the separator with two, then applying the originally intended replacement, then replacing each two occurrences of the separator with one. Here s the expression in T-SQL:
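A sketch of the doubled-separator expression, following the steps just described:

```sql
DECLARE @s AS VARCHAR(1000) = '.x.x.x.x.';

-- Double each separator, apply the intended replacement,
-- then collapse the doubled separators back to one.
SELECT REPLACE(REPLACE(REPLACE(@s, '.', '..'), '.x.', '.y.'), '..', '.') AS result;
-- .y.y.y.y.
```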

    String Formatting Numbers With Leading Zeros

    There are several tasks related to formatting numbers as strings that people often inquire about. T-SQL perhaps isn't the best place to handle those, but I'll still present solutions, mainly for their technical value.

    The output of this code is -0000001759.

    Another solution involves converting the absolute input value to a character string, concatenating a string with nine zeros (000000000) with the result of the previous step, extracting the 10 rightmost characters from the result, and finally adding the minus sign in front if the input is negative. Here s the expression that implements this logic:
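The expression was lost in extraction; a sketch implementing exactly the logic just described (the input -1759 matches the stated output):

```sql
DECLARE @val AS INT = -1759;

-- Concatenate nine zeros with the absolute value, take the 10 rightmost
-- characters, and prepend the minus sign if the input is negative.
SELECT CASE WHEN @val < 0 THEN '-' ELSE '' END
       + RIGHT('000000000' + CAST(ABS(@val) AS VARCHAR(10)), 10) AS result;
-- -0000001759
```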

    SQL Server 2012 (formerly code-named "Denali") is planned to support a function called FORMAT that will make your life really easy when it comes to formatting strings representing numbers and other types of values. The FORMAT function accepts a format string compatible with .NET format strings for similar purposes, indicating how you want to format the value. For our specific task, the expression is very simple:
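A sketch of the FORMAT-based solution (requires SQL Server 2012 or later):

```sql
DECLARE @val AS INT = -1759;

-- The .NET format string '0000000000' pads to ten digits.
SELECT FORMAT(@val, '0000000000') AS result;  -- -0000001759
```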

    Left Trimming Leading Occurrences of a Character

    The technique to achieve this is quite straightforward when the source string doesn't contain spaces to begin with. You first replace each occurrence of a zero with a space, then apply LTRIM to remove leading spaces, then replace each occurrence of a space with a zero, like so:
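The listing was lost in extraction; a sketch of the LTRIM trick (the input '00001709' is chosen to match the stated output):

```sql
DECLARE @s AS VARCHAR(10) = '00001709';

-- zeros -> spaces, trim leading spaces, spaces -> zeros.
SELECT REPLACE(LTRIM(REPLACE(@s, '0', ' ')), ' ', '0') AS result;  -- 1709
```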

    This code returns 1709.

    If spaces are allowed in the input, you want to first replace each existing space with a token that you know can't appear in the source data, then apply the previous technique, then replace each occurrence of the token with a space, like so:
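A sketch of the space-safe variant, using ~ as the token (assuming ~ cannot appear in the data; the input matches the stated output):

```sql
DECLARE @s AS VARCHAR(20) = '00001709 ABC';

-- spaces -> token, zeros -> spaces, trim, spaces -> zeros, token -> spaces.
SELECT REPLACE(
         REPLACE(LTRIM(REPLACE(REPLACE(@s, ' ', '~'), '0', ' ')), ' ', '0'),
         '~', ' ') AS result;  -- 1709 ABC
```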

    This code returns 1709 ABC.

    This just gives you an idea of dealing with one complication. Of course there can be several additional complications, such as support for negative values, spaces at the beginning, and so on.

    Checking That a String Is Made of Only Digits

    One solution in which you don't need to worry about special cases is to generate a string pattern in which you replicate the character pattern [0-9] (basically, a digit) as many times as the number of characters in the input string, like so:
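The listing was lost in extraction; a sketch of the replicated-pattern check (sample value is mine):

```sql
DECLARE @s AS VARCHAR(100) = '1759';

-- One '[0-9]' class per input character; any non-digit breaks the match.
SELECT CASE WHEN @s LIKE REPLICATE('[0-9]', LEN(@s))
            THEN 'True' ELSE 'False' END AS is_digits_only;  -- True
```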

    The problem with this approach is that if the input strings are very long, the pattern generated will be five times as long as the input.

    There's another solution that also doesn't require you to deal with special cases, and it uses a very short pattern. You use two levels of negation, verifying that the string doesn't contain any character that's not a digit, like so:
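A sketch of the double-negation check (sample value is mine):

```sql
DECLARE @s AS VARCHAR(100) = '1759';

-- True if the string does NOT contain any character that is NOT a digit.
SELECT CASE WHEN @s NOT LIKE '%[^0-9]%'
            THEN 'True' ELSE 'False' END AS is_digits_only;  -- True
```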

    Just like with the previous techniques I described, the last two validation techniques implement the logic using a single expression. This means that you can embed the predicate wherever predicates are supported in T-SQL, operating on a table or view attribute. For example, if you need to enforce this rule in a CHECK constraint as part of a table definition, you can, like so:
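The CHECK-constraint listing was lost in extraction; a sketch using a hypothetical table and column (the names dbo.Phones and phone_num are mine):

```sql
-- Reject any row whose phone_num contains a non-digit character.
CREATE TABLE dbo.Phones
(
  phone_num VARCHAR(20) NOT NULL
    CONSTRAINT CHK_Phones_digits_only CHECK (phone_num NOT LIKE '%[^0-9]%')
);
```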

    Make the Most of T-SQL s Tools

    I covered several common string manipulation tasks: counting occurrences of a substring within a string, verifying that there are an exact number of occurrences of a substring within a string, replacing multiple contiguous spaces with a single space, replacing overlapping occurrences, formatting strings representing numbers with leading zeros, left trimming leading character occurrences, and verifying that a string is made of only digits. As you saw, T-SQL doesn't provide very rich capabilities in this area. You therefore need to be a bit creative with the limited set of tools that you do get. Fortunately, SQL Server "Denali" improves support for string manipulation by adding new functions, among them the FORMAT function that lets you format an input based on a format string.

    Next month, I'll continue the discussion about string manipulation, and I'll cover additional challenges involving tricky type conversions. If you have your own cool and creative techniques to handle common string manipulation needs, I'd love to hear about them.



    Best VPS HOSTING – Virtual Private Servers – June 2017


    VPS Hosting

    In VPS hosting, which stands for Virtual Private Server hosting, a physical server is divided into several logical virtual servers, each of which is private. This means that, contrary to shared hosting, you get full root access to your server; no other user can sneak into your account or files. You can also fine-tune all configuration files, and you can even run any Linux program that you may want to.

    VPS hosting is almost as good as dedicated hosting. The only drawback is lower performance, since the physical resources (processor, hard drive, etc.) are shared between several VPS instances. The advantage, though, is that you get all the security and flexibility of a dedicated server at just a fraction of the price.

    The main caveats in choosing your VPS hosting provider are:

    • Performance: this depends as much on the actual specs of the physical servers the hosting company uses as on how many virtual servers the hosting provider decides to run on each physical server. This information is very difficult to come by.
    • Technical support: with root access to your server, it is pretty easy to mess things up. In those cases, quality, responsive technical support makes a huge difference from one host to another.

    Below is a list of VPS hosting companies we have successfully run b2evolution blogs and forums on.

    Top recommended VPS hosting plans

    Learn more about VPS hosting

    We receive a lot of feedback and comments from people purchasing a webhosting account for the first time. One of the most misunderstood events is customer verification. Here is the why and how about this sensitive topic.

    Why do customers get verified?

    Web hosting is a very legitimate need for most individuals and businesses. Unfortunately it is also an urgent need for hackers and spammers of many kinds. These people will always try to sign up for cheap hosting accounts, especially if there is an x months for free promo. Worse: they will also sign up for more expensive hosting accounts using stolen credit card info!

    This is why hosting companies have to remain vigilant about every sign-up they get. Unfortunately, this is also why you may have to provide additional info upon signup, which could extend your signup process from a few minutes to a couple of days. It is for your own safety, though.

    So you're ready to start your own website? Congratulations! Here are five easy steps to get started without wasting any time!

    Step 1: How does it work?

    In order for your website to be available to anyone at any time, you need to host it with a web host. In other words: you need a web server leased to you by a web hosting company.

    Most hosting companies offer all-inclusive packages by default, which include:

    • a domain name,
    • the web-hosting itself
    • several email addresses/mailboxes.

    Thus, if you're just starting out, it is generally a good idea to get both your domain and your hosting in one single sign-up with a hosting company. (Depending on your needs, you may or may not want to also take advantage of the email addresses.)

    Step 2: Which kind of web site do you want?

    Think about your smartphone for a minute: you can enter notes into the basic existing apps, but you can also

    Hosting plan details

    « InMotion Hosting’s Virtual Private Servers are a great alternative to dedicated servers and perfect for those who need more than what a shared account can offer. They have everything necessary to be a complete separate server, including optional root level access.

    ALL Packages include a minimum of: Free SSDs, Free Backups, SSH Access, 4GB RAM, 60GB Storage, 2 TB of Bandwidth, and Free cPanel License »

    Up to the comparison chart

    Our custom-built servers and 24/7 team of experts deliver breakthrough performance that grows with you. Get up and running in seconds. Whereas most VPS solutions take hours or days to activate, we designed our VPS servers to provision immediately.

    Extreme performance • Instant provisioning • Guaranteed resources • Root access • CentOS

    Up to the comparison chart

    You will be able to control all aspects of your VPS through our proprietary VPS management portal making managing your virtual private server a piece of cake.

    SSD Storage • Tier-1 Bandwidth • FREE Quick Provision • 300% Powered by Renewable Energy • Provisioned on Blacklist Free IPs • cPanel/WHM License Included ($200/yr value) • FREE Migration Service • FREE Nightly Backup • 99.9% Service Uptime

    Up to the comparison chart

    Lunarpages offers more powerful VPS solutions for your website, business, blog, and more, with 24/7 real technical support. Get a semi-dedicated server that's easy to manage and easy to afford.

    Linux (CentOS/Ubuntu) / Windows • Host UNLIMITED Sites • Root SSH Access • Managed Hosting • 99.9% Uptime • 30 Day Money Back Guarantee • Guaranteed RAM

    Up to the comparison chart

    « VPS hosting is not created equal. With Liquid Web, you’ll enjoy 100% US-based Heroic Support®, 24/7/365 phone and chat support, and a 59-Second support SLA.

    Engineered for peace of mind with CloudFlare® CDN, built-in backups, enhanced security, and DDoS Attack Protection. »

    Up to the comparison chart

    « Lightning-fast SSD servers starting from $8/mo.
    Full root access, free backups, and free migration assistance.

    100% SSD • 100% Uptime Guarantee • Free 10 Gbps DDoS Protection • SolusVM Control Panel

    We currently offer VPS hosting servers in Los Angeles, Chicago, Miami, and Dallas. Choose the server closest to you for faster speeds for you and your visitors. »

    Up to the comparison chart

    Want alternatives?

    • If you don’t actually need to fine-tune your configuration, or if you are looking for a lower-budget hosting solution, check out our selection of shared hosting plans .
    • If you have a higher budget, we actually recommend that you consider a small, cheap dedicated server, which will give you significantly better performance. For example, you may get 10 times the performance of a VPS for only 3 times the price of a VPS.

    Hosting options



    Backup & Recovery Software: Virtual Server, Data Protection Analytics


    Data Protector and Backup Navigator

    Data Protector Trial

    Data Protector

    HPE Data Protector, our core data protection engine, provides high performing backup and recovery across various data repositories, applications and remote sites in physical and virtual environments. It standardizes and consolidates backup and recovery processes so that businesses can improve reliability, gain business resiliency, and reduce cost and complexity of backup operations. Data Protector is the foundation of the HPE Adaptive Backup and Recovery (ABR) suite which includes Storage Optimizer for analyzing, classifying and managing data based on its value, and Backup Navigator for optimizing the backup environment by using operational analytics and insight. Together, this suite enables our customers to gain a 360 degree view of their backup environments to constantly tune and adapt to deliver optimal results. Read more .

    Data Protector Key Capabilities

    Comprehensive Enterprise Data Protection

    Simplify and standardize data protection across heterogeneous environments, applications and media. Extensive support matrix simplifies integrations with 3rd party systems and solutions and eliminates the need for multiple point products.

    Built-in Disaster Recovery

    Automate disaster recovery with centralized bare metal recovery from physical to physical, physical to virtual, virtual to virtual, and virtual to physical from any backup set at no additional cost.

    Application Recovery

    Ensure granular recovery with native integrations with core enterprise applications and databases to extend backup, automate point-in-time recovery, and enable application owners to manage, drive and service their own backup and recovery requirements.

    Snapshot Protection

    Rapidly and efficiently recover your data thanks to array-based snapshot integrations while removing the burden that traditional backup technologies have on the production environment.

    Advanced Virtual Server Protection

    Protect your data in virtualized environments with major hypervisor integrations, tiered recovery options, process automation, and analytics and visualization for virtual environments.

    Storage Optimization

    Reduce storage resources needed and cut costs with federated deduplication and a single deduplication engine across the portfolio on the application source, backup server or target system with StoreOnce Catalyst.

    Create complex scheduling and prioritization of backup jobs. Set SLA baselines by defining typical job runtimes.

    Improve end user productivity with Granular Recovery Extensions (GRE) for VMware vSphere, Microsoft Exchange and SharePoint, by allowing application administrators to recover single items directly from the administrative console.

    Get active alerting and event management for backup and recovery operations with Data Protector management extensions such as Microsoft SCOM.


    Transforming Data Protection with HPE

    More organizations are moving away from fragmented point-based solutions to HPE’s unified data protection solution. Data Protector, through integrations with storage systems such as StoreOnce and 3PAR arrays, can provide you with advanced capabilities that accelerate the backup process in a cost-efficient, scalable way to provide the best reliability, lowest management complexity and highest level of innovation.

    Adaptive approach to backup and recovery

    Centrally managed and standardized backup and disaster recovery that extends from the core data center to the edge of your business

    Actionable insight and awareness for rapid root-cause analysis, bridging data protection gaps, and planning future data protection needs





    Using the Microsoft SQL Server 2012 Best Practice Analyzer

    By: Ashish Kumar Mehta | Read Comments (4) | Related Tips: More > SQL Server Configurations

    Database Administrators are often asked questions like “are all the SQL Servers within the organization configured according to industry standards?” In this tip, you will see how a Database Administrator can quickly use Microsoft SQL Server 2012 Best Practices Analyzer to analyze a SQL Server instance to determine whether it is configured according to best practices or not.


    The Microsoft SQL Server 2012 Best Practices Analyzer (BPA) is an excellent tool which is available as a free download from Microsoft. Using the BPA tool, a DBA can quickly gather information from Microsoft Windows and SQL Server configuration settings. The BPA tool basically uses a predefined set of SQL Server 2012 recommendations and best practices to identify potential issues within the database environment. You can download the latest version of Microsoft SQL Server 2012 Best Practices Analyzer (BPA) for free from the following link .

    The following are required for using SQL Server 2012 Best Practices Analyzer:

    Prior to the installation of Microsoft SQL Server 2012 Best Practice Analyzer, you need to download and install Microsoft Baseline Configuration Analyzer 2.0; otherwise, you will see the screen below when you double click SQL2012BPA_Setup64.MSI or SQL2012BPA_Setup32.MSI (depending on your environment) to install Microsoft SQL Server 2012 Best Practice Analyzer.

    Once you have successfully installed Microsoft Baseline Configuration Analyzer 2.0 and Microsoft SQL Server 2012 Best Practice Analyzer, you can use the BPA tool by clicking Start – Programs – Microsoft Baseline Configuration Analyzer 2.0. In Microsoft Baseline Configuration Analyzer 2.0, select a product such as SQL Server 2012 BPA as shown in the snippet below.

    Next, click Connect to Another Computer and choose Local Computer and then click OK to close the “Connect to Another Computer” window. Then in the Microsoft Baseline Configuration Analyzer 2.0 window, click Start Scan to begin the scan.

    In the Microsoft Baseline Configuration Analyzer 2.0 Enter Parameters screen, you need to specify the SQL Server Instance Name, for a default instance you need to specify the instance name as MSSQLSERVER and for a Named Instance you need to specify the SQL Server Instance Name as shown in the below snippet.

    Using Microsoft SQL Server 2012 Best Practice Analyzer you can scan the SQL Server Database Engine, Analysis Services, Replication, Integration Services, Reporting Servers and SQL Server Setup. You can choose the required parameters and click the Start Scan at the bottom of the screen to begin the scan.

    In the below snippet you can see the Microsoft Baseline Configuration Analyzer 2.0 is scanning to identify potential issues.

    Once the Microsoft SQL Server 2012 BPA 1.0 has completed the scan, you will see the Baseline Configuration Analyzer Report, which is categorized into Errors and Warnings as shown in the snippet below.

    Once you expand the Error category as shown in the snippet below, you will see the different errors that exist on the SQL Server instance, according to the rules configured in the Best Practices Analyzer.

    You can click on an Error to view a detailed explanation of the issue encountered; to resolve the issue, follow the resolution steps provided.
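    As an aside, the scan above can also be run without the GUI: Microsoft Baseline Configuration Analyzer 2.0 installs a PowerShell module. A scripted run against the SQL Server 2012 BPA model might look roughly like the sketch below; the cmdlet and model names are taken from the MBCA module as I recall them and may differ on your version, so verify them with Get-Help on your system.

```powershell
# Sketch only: run from an elevated PowerShell prompt on the server.
Import-Module BaselineConfigurationAnalyzer

# List the installed models to confirm the SQL Server 2012 BPA model Id.
Get-MbcaModel

# Run the SQL Server 2012 BPA model (model Id assumed to be SQL2012BPA).
Invoke-MbcaModel -ModelId SQL2012BPA

# Review the report from the latest scan.
Get-MbcaReport -ModelId SQL2012BPA
```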

    Advantages of Using SQL Server 2012 Best Practices Analyzer

    • Using this tool, a DBA can determine whether SQL Server is configured according to the best practices recommended by the SQL Server product team.
    • This tool validates your instance of SQL Server with certain built-in rules that can help you rectify common installation and configuration issues.
    • This tool recommends fixes for most of the potential problems on an instance of SQL Server, and it is an excellent free tool for identifying potential bottlenecks within your SQL Server instance.

    Disadvantages of Using SQL Server 2012 Best Practices Analyzer

    • I feel some of the resolution messages are not very clear; if you have any doubts, I recommend clicking the “More Information” option under each Error or Warning to get a complete understanding before implementing the suggestion.
    • This tool is a great starting point for identifying issues, but it does not address every potential issue for a SQL Server instance. You still need to educate yourself on best practices for setting up SQL Server.

    Troubleshooting Issues with SQL Server 2012 Best Practices Analyzer Installation

    • As a best practice, the person installing the SQL Server 2012 Best Practices Analyzer should have their Windows account in the Windows local Administrators group and the SQL Server administrators group on the server where the tool is installed.
    • If you receive any issues related to PowerShell, it is recommended to take a look at the Additional Information section of the Microsoft SQL Server 2012 Best Practice Analyzer Download Page.


    Last Update: 2012-06-22


    Posted In: NEWS


    Why Does Windows Server Backup Keep Only One Copy?


    Why Does Windows Server Backup Keep Only One Copy?

    Does Windows Server Backup keep only one copy?

    Many users have created backups using Windows Server Backup. Some users assigned a drive letter to the backup drive and then checked the image backup files. To their surprise, there was only one virtual hard disk (VHD) file in the backup location, even after running scheduled backups several times. To understand why this happens, you must understand how Windows Server Backup works. There is actually no setting in Windows Server Backup for the number of copies to keep. When you run a backup, Windows Server Backup creates a VHD file. Windows then creates a shadow copy of the target backup drive where the image backup is stored. If you keep running scheduled backups, many shadow copies will be taken. Although there may be only one VHD file or VHDX file (in Windows Server 2012 or later), you can still restore to previous versions using these shadow copies.
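    To make this concrete, each backup run like the hedged sketch below produces (or updates) that single VHD on the target and records a new shadow-copy-based version; the source and target drive letters here are placeholders for your system.

```powershell
# Back up the system volume (C:) to a dedicated backup drive (E: assumed).
# Run from an elevated prompt; wbadmin ships with Windows Server.
wbadmin start backup -backupTarget:E: -include:C: -allCritical -quiet

# List the recorded backup versions; note there is still only one VHD/VHDX file.
wbadmin get versions
```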

    How to track the missing VHD?

    To better understand where the VHD file goes, you can track the multiple backup copies yourself. To list all the shadow copies, run the command “vssadmin list shadows” in an elevated command prompt.

    Based on the “Shadow Copy ID” shown in the output, you can find the backup associated with each shadow copy by matching its snapshot ID. To list the snapshot IDs of the backups, run the command “wbadmin get versions”.

    If you are still not sure about this, you can assign a drive letter to a shadow copy and then view its contents in File Explorer by using the following command.

    diskshadow
    DISKSHADOW> expose <shadowCopyID> <driveLetter>:

    For example, you could assign drive letter H: to a shadow copy. However, if you need an easier way to handle multiple backup versions, you can switch to third-party server backup software that does not limit the number of copies.
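    Putting the tracking steps above together, a minimal sketch looks like this; the shadow copy ID and drive letter are placeholders that you take from the actual vssadmin output.

```powershell
# 1. List all shadow copies on the machine (elevated prompt required).
vssadmin list shadows

# 2. Match shadow copies to backup versions via their snapshot IDs.
wbadmin get versions

# 3. Inside the interactive diskshadow utility, expose one shadow copy
#    as drive H: so its contents can be browsed in File Explorer.
diskshadow
#   DISKSHADOW> expose {shadow-copy-ID} H:
```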

    How to keep multiple backup copies?

    The server backup and recovery program AOMEI Backupper Server allows you to save many versions of backup files that can be easily managed in File Explorer. It supports daily/weekly/monthly scheduled backups, which can be run as full, incremental, or differential backups. What’s more, it provides a Disk Space Management feature to automatically clean up obsolete previous backups.

    To back up Windows Server with AOMEI Backupper:

    Step 1. Download the free trial and run this backup tool for Windows Server. In the intuitive interface, click “System Backup” under the Backup tab.

    Step 2. For a System Backup, the program automatically selects all the items needed for a system restore by default, so you only need to select the destination location.

    Step 3. After specifying the destination path, click the “Schedule” button below and choose when you want the backup to run in the pop-up window.

    Step 4. After all settings are done, click “Start Backup” to start the backup.

    Since many people have wondered why Windows Server Backup keeps only one copy, AOMEI Backupper comes into play and makes backup versions easier to understand, which is also convenient for Windows Server restores.



    What is Hosting Server? Webopedia Definition


    hosting server

    Related Terms

    A server dedicated to hosting a service or services for users. Hosting servers are most often used for hosting Web sites but can also be used for hosting files, images, games and similar content. Hosting servers can be shared among many clients (shared hosting servers) or dedicated to a single client (dedicated servers), the latter of which is particularly common for larger Web sites where the hosting needs of the Web site owner necessitate more control and/or bandwidth.

    See also “All About Web Site Hosting” in the Did You Know? section of Webopedia.

    hosted VoIP

    hosting services







    Virtual Receive-side Scaling in Windows Server 2012 R2


    Virtual Receive-side Scaling in Windows Server 2012 R2

    Anthony, a network administrator, is setting up a new host for his workplace with two NICs that are SR-IOV capable. He is going to use Windows Server 2012 R2 to host a file server system that his workplace is purchasing and implementing. After the hardware and software are installed, Anthony configures his VM to use 8 virtual processors and 4096 MB of memory. Unfortunately, Anthony does not have the option of turning on SR-IOV because his VMs rely on policy enforcement through the vmSwitch. Initially, Anthony assigns 4 virtual processors through PowerShell to be available for use with vRSS. After a week, the service has become extremely popular and Anthony checks the performance. He discovers that all four virtual processors are fully utilized. Anthony therefore changes the RSS settings and assigns two more virtual processors to be available to RSS in the virtual machine to help handle the large network load.

    Sandra, a network administrator, is setting up a single large virtual machine on one of her systems for the sole purpose of being a software load balancer. Sandra has just installed Windows Server 2012 R2 so that she can expand VMQ to use multiple processors per vmNIC. Since Sandra only has one vmNIC in this system she decides she will turn on RSS in the VM to get spreading for processing through the vmSwitch. Since vRSS is disabled by default, Sandra enables RSS using PowerShell cmdlets.

    There are 4 steps you should take before implementing Virtual Receive-side scaling:

    Verify that the network adapter is VMQ capable and has a link speed of >= 10 Gbps. If the link speed is less than 10 Gbps, VMQ is disabled by default by the vmSwitch, even though it will still show as enabled in the PowerShell cmdlet Get-NetAdapterVmq. One way to verify that VMQ is disabled is to use the cmdlet Get-NetAdapterVmqQueue, which will show that no QueueID is assigned to the VM or host vNIC.

    Verify that VMQ is enabled on the host machine. Virtual Receive-side scaling will not work if the host does not support VMQ. You can check this by running Get-VMSwitch and finding the adapter that the vmSwitch is using. Next, run Get-NetAdapterVmq and ensure that the adapter is shown in the results and has VMQ enabled.

    Verify that an SR-IOV Virtual Function (VF) driver is not attached to the VM network interface. This can be done using the Get-NetAdapterSriov cmdlet. If a VF driver is loaded, Receive-side scaling will use the scaling settings from this driver instead of those configured by Virtual Receive-side scaling. If the VF driver does not support Receive-side scaling, then Virtual Receive-side scaling is disabled.

    If you are using LBFO (NIC Teaming), it is essential that VMQ be properly configured to work with the LBFO settings. For detailed information about LBFO deployment and management, see NIC Teaming Configuration and Management .
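    The pre-checks above can be sketched in PowerShell roughly as follows; the adapter name is a placeholder for your environment, and the cmdlets are run on the host (or inside the VM for the VF check, as appropriate).

```powershell
# 1. Check VMQ capability/state and whether queues are actually assigned
#    (no QueueID means VMQ is effectively disabled, e.g. on sub-10G links).
Get-NetAdapterVmq
Get-NetAdapterVmqQueue

# 2. Find the adapter behind the vmSwitch and confirm VMQ is enabled on it.
Get-VMSwitch
Get-NetAdapterVmq -Name "Ethernet 1"   # adapter name is a placeholder

# 3. Check whether an SR-IOV Virtual Function driver is attached.
Get-NetAdapterSriov
```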

    Virtual Receive-side scaling can be enabled using PowerShell or with Device Manager:

    To enable Virtual Receive-side scaling using Device Manager

    On the virtual machine, open Device Manager (in Settings, click Control Panel, and then click Device Manager).

    Expand Network adapters, right-click the network adapter you want to work with, and then click Properties.

    On the Advanced tab in the network adapter properties, locate the setting for Receive-side scaling and make sure it is enabled.

    Virtual Receive-side scaling settings in the virtual machine are configured using Set-NetAdapterRss, which is the same cmdlet used for native RSS. You can view the settings of the virtual machine by using the cmdlet Get-NetAdapterRss. A portion of the PowerShell help for the cmdlet is below, and a full description of the cmdlets, listing all the configurable settings and expected types, can be found on TechNet at Set-NetAdapterRss.
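    As a sketch, enabling and inspecting RSS from inside the guest might look like this; the adapter name and processor count are placeholders, not recommendations.

```powershell
# Run inside the virtual machine; "Ethernet" is a placeholder adapter name.
Enable-NetAdapterRss -Name "Ethernet"

# View the current RSS configuration for the vmNIC.
Get-NetAdapterRss -Name "Ethernet"

# Example: limit RSS to 4 processors in the guest.
Set-NetAdapterRss -Name "Ethernet" -MaxProcessors 4
```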

    Setting the profile inside the virtual machine will not impact the scheduling of the work. The hypervisor makes all the scheduling decisions and ignores the profile inside the virtual machine.

    Virtual Receive-side scaling is on, but how do I know if it is working?

    You’ll be able to tell Virtual Receive-side scaling is working by opening the task manager in your virtual machine and viewing the virtual processor utilization. If there are multiple connections established to the virtual machine, you will see more than one core above 0% utilization.

    The same holds true for the host. Virtual Receive-side scaling made changes to the VMQ algorithm to expand virtual machine processing to more than 1 core for vmSwitch processing. If a single core reaches 80% utilization, you should see the traffic start to expand to multiple cores.

    Are there any perfmon counters I can check?

    In Perfmon, there are three counters that can help you evaluate Virtual Receive-side scaling. Once in Perfmon, right-click on the graph and click Add Counters…. Click the Hyper-V Virtual Switch Processor category; the counters are under:

    Number of VMQs – The number of VMQs affinitized to that processor

    Packets from External – Packets indicated to a processor from any external NIC

    Packets from Internal – Packets indicated to a processor from any internal interface, i.e. a vNIC (host NIC) or vmNIC.
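    The same three counters can also be sampled from PowerShell with Get-Counter, for example as below; the counter paths assume the category and counter names listed above.

```powershell
# Sample the vmSwitch processor counters every 2 seconds, 5 times.
Get-Counter -Counter @(
    "\Hyper-V Virtual Switch Processor(*)\Number of VMQs",
    "\Hyper-V Virtual Switch Processor(*)\Packets from External",
    "\Hyper-V Virtual Switch Processor(*)\Packets from Internal"
) -SampleInterval 2 -MaxSamples 5
```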

    I’m looking at the host and not all of the processors are being used. It looks like every other one is being skipped.

    Check to see if hyper-threading is enabled. VMQ and Virtual Receive-side scaling are both designed to skip hyper-threaded cores.



    Novosoft Office Backup – Data Backup and Disaster Recovery Software for


    Novosoft Office Backup – Data Backup Software

    Office Backup is backup software designed to automatically back up data on PCs running Windows operating systems. It allows you to protect important data from loss or damage by regularly creating backups and storing them on secure storage media or servers. Then, if something happens to your PC, you will be able to restore all your information in a matter of minutes.

    With Office Backup you can perform both file-based and image-based backups of your HDD. The program has a simple wizard-driven interface and can be easily used by users of all levels of computer expertise. It provides an easy way to enable backup compression and encryption, and to schedule tasks to run automatically. Supported operating systems include Windows 7/Vista/XP/2000 and Windows Server 2008/2003/2000.

    NEW! Grand premiere! Novosoft Office Backup 4 is officially released! The latest version of the popular office backup utility features the handy expansion of cloud backup features (explore backup to Amazon S3 and manual-registration-free Cloud Storage by Novosoft) and the hottest advanced backup options (MSSQL backup, Outlook hot backup, support of Volume Shadow Copy for MS Exchange backup, and more).

    For details, feel warmly welcome to check the News.

    In addition to FTPS and DiscHub, Office Backup supports backing up to the following media and servers:

    Since some features of the utility are aimed at experienced IT users, we have decided to divide its functions into two separate editions: one tailored for needs of Home Users and another for IT Professionals .

    Office Backup Home –
    Solution for Home users

    Office Backup Professional –
    Solution for IT Professionals

