IT Solutions for Small Business

Our goal is to make your company more profitable (however you measure profitability) by providing cost-effective IT solutions to many of your organization's fundamental challenges. Allora partners provide technical solutions for small to mid-sized businesses of virtually every type: from office environments, to manufacturing facilities, to retail storefronts. Our IT solutions are refined by many years of practical application in the real world; they are flexible, yet reliable.

The positive effects of computer networking on employee productivity, critical data retention, and overall IT budget savings are well established throughout industry. Allora builds reliable computer networks based on any combination of Windows, Macintosh, and Unix/Linux devices.

Email is also an indispensable business technology, both within an organization and as an economical, effective way of reaching customers. Websites, if nothing else, are cheap advertising; but a well-designed web application can qualitatively change your company's operations. Allora provides a full-featured set of hosting solutions for email, websites, and web apps.

Allora can also integrate your telephone system with your network or web presence, either to inform you about your callers, or simply to save money on long distance. We also provide cloud IT solutions via hosted Small Business Servers.


As part of our company philosophy, we encourage our customers to learn more about IT infrastructure by reading the following informative articles, which can save you time and money:

How anonymous surfing works

Anonymizers provide a means to surf the net anonymously and to get around web filters and local legislative restrictions on the websites you visit. This article covers the main types of anonymizers and how they are used, and discusses the most popular ones in detail.

In recent years, the Web has become the dominant part of the Internet, and a browser is open on a user's computer most of the time. Services that originally worked through special programs and protocols are now available in web versions: e-mail, messaging systems, conferences, video chats, and much more. At the same time, users face problems with free access to websites. There are many restrictions: employers block access to sites from users' computers, providers and communication operators ban access to web resources at the order of state bodies, website owners and hosts block access for certain countries and regions, and so on. All of this limits the free use of web resources for ordinary users.

There is another problem. Websites collect a lot of information about their visitors, and registration is often required, which involves handing over personal data. Websites install special cookies to track your movements around the site, advertising networks monitor all your actions, and social networks transmit the information they are entrusted with to third parties such as advertisers and state bodies. Under the pressure of this pervasive surveillance of the World Wide Web, ordinary users increasingly wish to keep their privacy and anonymity.

All of these problems share one solution: anonymizers. These are special services or applications that redirect the user's web traffic through their own servers. They provide anonymity by hiding the user's real IP address and removing tracking cookies. They also allow bypassing various filters, both regional ones and those installed by your local provider.

Let's first look at how we usually get access to a website. The user enters the website address in a web browser, which generates an HTTP request and sends it to the website's server. On the route to the target web server, the user's traffic can pass through one or more filters: a transparent proxy server set up by the employer to monitor Internet use at the workplace, a filter from the provider designed to enforce federal legislation by blocking access to prohibited sites, or a filter from a telecom operator performing similar functions. As a result, the request can be rejected and you will be redirected to a dead-end page informing you that access is prohibited. In this case, the target web server never even receives your request.
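To make this concrete, here is a minimal sketch (the host name and browser identifier are illustrative, not a real product) of the kind of plain-text HTTP request a browser generates. Fields like the Host header are exactly what filtering proxies along the route inspect when deciding whether to block a request:

```python
# Build the raw HTTP/1.1 request a browser would send for http://example.com/.
# Filtering proxies read headers such as Host to decide whether to block
# the request before it ever reaches the target server.

def build_http_request(host: str, path: str = "/") -> str:
    """Return the plain-text HTTP/1.1 GET request for host/path."""
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "User-Agent: ExampleBrowser/1.0\r\n"
        "Connection: close\r\n"
        "\r\n"
    )

request = build_http_request("example.com")
print(request)
```

A real browser adds many more headers, but the structure, and the fact that the target host is visible in clear text, is the same.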

An anonymizer breaks this direct chain of communication and becomes an intermediary between your web browser and the desired web server. Your request is sent to the anonymizer's server, and since that server is not on the banned list and no blocking is applied on its side, the request successfully reaches its destination. In turn, the anonymizer's server creates a new request with your data to the target web server and redirects the response back to your computer.

Types of Anonymizers

There are various types of anonymizers, differing in the technologies used and the ways they unblock websites. In addition, anonymizers can be free, shareware, ad-supported (with advertisements on visited websites), traffic-limited, or fully paid. First, let's define the types of anonymizers:

Web anonymizers run as websites and let the user operate without installing additional software. All you need to do is go to the web anonymizer and enter the address of the website you want to access, and all of that website's content becomes accessible to you. This type of anonymizer has a number of technical limitations. Many complex sites cannot be redirected this way because of the large number of complex links, JavaScript scripts, the use of AJAX technology, and so on, so many interactive websites will not function properly. Besides, web anonymizers usually cannot cope with redirecting multimedia content such as music and video. Web anonymizers are ideal for quick access to simple websites.
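Under the hood, a web anonymizer rewrites every link on the page so that it points back through the anonymizer itself. A toy illustration of that rewriting (the anonymizer URL is hypothetical, and the regex only catches static links, which is precisely why JavaScript- and AJAX-generated links slip through and break such sites):

```python
import re

ANONYMIZER = "https://anon.example/fetch?url="  # hypothetical service URL

def rewrite_links(html: str) -> str:
    """Rewrite static href attributes so they route through the anonymizer.

    Links that are generated at runtime by JavaScript never appear in the
    HTML source, so this rewriting cannot catch them -- the root cause of
    broken interactive sites behind web anonymizers.
    """
    return re.sub(
        r'href="(https?://[^"]+)"',
        lambda m: f'href="{ANONYMIZER}{m.group(1)}"',
        html,
    )

page = '<a href="http://example.com/news">news</a>'
print(rewrite_links(page))
```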

Proxy server technology is actively used in spheres other than anonymization, but a proxy server is also ideal for performing these functions. The benefit of this technology is that you do not need to install additional programs: it's enough to specify the address of the proxy server in the browser settings, and you gain the ability to work with any content. The proxy server performs a full relay of all information, so it can work with any interactive websites and all types of multimedia. However, there are disadvantages. The proxy server will work for all websites at once, not just the ones you need. It also cannot work in a chain, which means that if a proxy is already used on your network to access the Internet, this method cannot be applied.
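Outside the browser's settings dialog, the same configuration can be expressed programmatically. A sketch using Python's standard library (the proxy address is a placeholder; substitute your own server):

```python
import urllib.request

# Placeholder proxy address; replace with your real proxy server.
proxy = urllib.request.ProxyHandler({
    "http":  "http://proxy.example.local:3128",
    "https": "http://proxy.example.local:3128",
})

# Every request made through this opener goes to the proxy, which relays
# it to the target site -- for ALL sites, not just the ones you actually
# need unblocked, exactly as described above.
opener = urllib.request.build_opener(proxy)
print(proxy.proxies["http"])
```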

VPN technology, like proxy server technology, is actively used for a different task: providing remote access to an internal network. However, VPN technologies are also successfully used for anonymization. The advantages of VPN include the ability to redirect all traffic, not only web traffic but that of any other services as well. This technology also has its disadvantages, such as the need to configure VPN access or install additional programs, and the inability to operate through a corporate proxy server.

Browser extensions are anonymizers that require installation, after which they can be activated on certain websites to redirect traffic through their servers. The quality of such an anonymizer depends directly on its developer and the operation of its servers. Some, like ZenMate, function correctly with any sites and media content, while others produce results worse than web anonymizers. Their behavior with corporate proxy servers varies in the same way: some work through them without problems, while others are not fully functional.

Special browsers are a type of anonymization significantly different from the rest: custom builds of popular browsers (usually Chromium or Firefox) with built-in anonymization tools that work either through simple extensions, as described in the previous paragraph, or through anonymous networks such as Tor. This type of anonymization does not require additional settings, but you do need to install a special browser, which complicates the user's work: you have to remember which websites are better viewed in the special browser, and use two similar programs at the same time.

Almost all of these types of anonymizers can be either paid or free, but there is a general tendency. Web anonymizers are usually conditionally free, displaying ads on the websites you visit, and most of them do not offer a paid subscription. VPN services are almost always paid, though some offer traffic-limited trial accounts. Special browsers, the last type, are represented only by free solutions.
Recovering deleted files

One of the most unpleasant and fairly frequent problems for a PC user is losing data for one of these reasons:
  • accidental deletion
  • an Operating System glitch
  • virus infection (e.g., a cryptolocker)
  • a hard drive error

Recovery of deleted or damaged files can be done through several routes:

The most natural and obvious path: recovery from a backup

The only drawback here is the potential time gap between the last backup and the last modification of the file in need of recovery. Quite often recent modifications are lost, so the user must repeat a certain amount of work. Even if it's just an hour, this can be very frustrating.

Windows' Previous Versions tool

This utility first saw life with the Windows Vista release. It's not always enabled out of the box, though; to make sure it's active, a user should go through a few simple steps. Here's a screencast for Windows 7; other operating systems work similarly. Please note that this functionality does consume a certain portion of your hard drive for storing shadow copies of your data. Unlike a backup, there's a much better chance of catching the latest version of a missing file.

Cloud synchronization of data folders

If you're working online and the folder with the missing file is synced with a cloud service such as Google Drive or Dropbox, then it's just a matter of logging in to that cloud via a web browser and restoring the file through a dedicated interface ("Manage versions" in Google Drive, for instance). There's a slim chance that the latest modification didn't complete its synchronization prior to deletion because of slow bandwidth and/or large file size.

Recovery of deleted files with third-party utilities

Here we discuss a scenario where none of the proactive steps mentioned above were taken, or where they wouldn't deliver the latest version of a deleted file.

Thanks to the very structure of a hard drive and its file system, it is possible to restore a freshly deleted file. Each file consists of a set of zeros and ones written in certain mapped areas of the hard drive. When you delete a file, the system merely marks it as deleted in its catalog; the data itself is left intact until another file is assigned to that space and its data overwrites the previous zeros and ones. This is very important to understand, because each write operation increases the risk of overwriting the data that belonged to the accidentally deleted file.

Ideally, no further activity should occur on the drive with the missing file. This is not difficult at all if you have multiple partitions on the drive and the deleted file is NOT on the partition where the OS resides. If that's not the case, it's best to shut down your computer immediately. A hard shutdown would further improve the odds of keeping your data intact; however, it could easily damage the Operating System itself, so we can't advise this route. From here there are two options: extracting the hard drive and plugging it into another computer for recovery of the deleted files, or loading a special recovery OS from a USB stick or CD.
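Recovery utilities exploit exactly this fact: the bytes are still on the disk, only the catalog entry is gone. One common technique is file carving, scanning the raw drive for known file signatures. A toy sketch over an in-memory "disk image" (real tools read the block device itself, and real carvers handle many formats and fragmentation):

```python
# Toy file-carving demo: scan a raw "disk image" for JPEG signatures.

JPEG_START = b"\xff\xd8\xff"   # JPEG files begin with this signature
JPEG_END   = b"\xff\xd9"       # ...and end with this marker

def carve_jpegs(image: bytes) -> list[bytes]:
    """Return every JPEG-looking byte run found in the raw image."""
    found, pos = [], 0
    while (start := image.find(JPEG_START, pos)) != -1:
        end = image.find(JPEG_END, start)
        if end == -1:
            break  # truncated file: its tail was already overwritten
        found.append(image[start:end + 2])
        pos = end + 2
    return found

# A fake disk image: garbage, a "deleted" JPEG, more garbage.
disk = b"\x00" * 16 + JPEG_START + b"photo-data" + JPEG_END + b"\x00" * 16
print(len(carve_jpegs(disk)))   # 1 recoverable file
```

If another file had overwritten the middle of those bytes, the carve would return a damaged or truncated result, which is exactly what the orange and red markers in recovery tools indicate.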

There are many programs that can handle this job. We'd focus on Recuva by Piriform as the most user-friendly. We'd also mention the Active@ products by LSoft, which provide a variety of tools for backup and data recovery, including much more serious scenarios such as partition damage.

Once you load Recuva, it will offer a list of file types for recovery: video, images, music, archives, or all files. If you remember the exact folder where the missing file resided, that will expedite the process considerably; otherwise, a scan could take hours, depending on the drive size and the number of files. Upon completion of the scan, Recuva displays a list of deleted files and the chances of recovery. If your file is marked green, then the file's data isn't damaged at all and its full recovery is guaranteed. Orange marks partially damaged data, and red is really bad news: at that stage the only recovery option left is sending the hard drive to a specialized center, and even then the odds are very slim. Solid-state drive scenarios are even more complex.
VPNs for business

VPNs are becoming an integral part of operating almost every business in every industry in today's fast-paced telecommuter world. Whether the need is to connect all your branches to the home office's resources, to allow your salesforce and project managers to access company resources remotely, or simply to surf anonymously, a VPN is the way to keep everyone connected to your centralized resources. So, what is a VPN? How do you set up a VPN? How do you choose the right configuration? What options are available, and why would one be better than another? Read on to find out!

At its core, a VPN is a tunnel through the internet from point A to point B that shields the data being sent and received from public access and scrutiny. There are two basic types of VPNs: a Site-to-Site VPN and a Client/Server VPN. A Site-to-Site VPN connects two or more separate physical locations, such as branches of a bank or retail chain, to a company's main internal network at its headquarters. This allows everyone in a branch, while in the office, to access company resources housed at the headquarters as if they were physically at that location. There are clear benefits to leveraging a VPN for this purpose: keeping data stored on centralized servers and allowing access through folder shares over a VPN makes for a more streamlined security system that is easier to manage, and backups are guaranteed to include the most important company data since it is centralized on servers at the main office. The drawback to a Site-to-Site VPN is that it only grants access to centralized company resources while employees at the remote location are in the office; employees who travel constantly or work from home would not have access to the VPN-available resources. A Client/Server VPN addresses the issue of traveling and remote individual workers. This type of VPN allows individual users to connect to centralized company resources no matter where they are; all they need is an internet connection and a laptop or mobile device configured for the VPN.

Whether a VPN is configured for Site-to-Site and/or Client/Server functionality, there are a number of protocols to choose from. These options include Internet Protocol Security (IPSec), Layer 2 Tunneling Protocol (L2TP), Point-to-Point Tunneling Protocol (PPTP), Secure Socket Tunneling Protocol (SSTP), and OpenVPN. Right off the bat: PPTP has been proven to be easily hacked and should NOT be used if at all possible. IPSec uses existing internet protocols to establish a secure connection at both ends of the tunnel and encrypt the traffic. L2TP creates a tunnel, but is usually combined with other protocols, such as IPSec, to secure the traffic over it. SSTP utilizes Secure Sockets to establish the tunnel, and either SSL or TLS to secure it. This is a newer option for Windows platforms only, and is usually preferred for its use of SSL/TLS certificates and the fact that port 443 is always open, so no additional ports need to be opened on a firewall to allow the VPN traffic through. OpenVPN is an open-source design for establishing a free VPN server that uses SSL to secure the traffic. This is done with SSL certificates that are either generated by an in-house CA server or with OpenSSL, which is included in the OpenVPN install. Additionally, passwords can be set on top of the certificates to add an extra layer of authentication and security.

There are many hardware and/or software options available these days, ranging from free VPN server software such as OpenVPN to elaborate hardware/software systems such as Cisco's various solutions. So, why choose one over the others (what's the best free VPN?), and what's involved in making that choice? First, decide whether a Site-to-Site or Client/Server VPN is what's needed, or both. Open-source and commercial solutions usually support both types of VPNs, and both have pitfalls in the learning curve, but look for one in each category that offers what your situation requires. Being open-source, OpenVPN is a free VPN server, but you're pretty much on your own if you need help answering "How to set up a VPN?". Cisco requires investing in their hardware and software, as well as client licenses for most devices to access the VPN. The benefit of commercial solutions is the available tech support and the unified design, which can make implementation and management feel easier, even if it's not.

If you choose to go with a commercial solution, then you are done with the initial decisions and should next start learning that particular platform in preparation for the design and implementation stages. Choosing the open-source OpenVPN is not the end of the decision making. OpenVPN is distributed as a stand-alone software package that will run both ends of a VPN, and it is compatible with most major platforms, making it a great choice for homogeneous and hybrid environments alike. In addition, OpenVPN has been integrated into other software packages such as pfSense, Untangle, and IPFire, as well as hardware such as Netgate's pfSense appliances and Ubiquiti's EdgeMAX products. We've not had any experience with Untangle or IPFire, but on paper both look similar to pfSense. We found Ubiquiti's EdgeMAX products to be very difficult and slow to configure. Additionally, appliances tend to have less power under the hood, which can get expensive for environments with high traffic volumes. pfSense differs in that it offers an almost all-inclusive package for implementing and managing a network, including OpenVPN, and is much easier to set up than Ubiquiti's or Cisco's equipment. What makes it even better is that you don't even have to buy Netgate's pfSense appliances: you can round up a desktop computer, apply the pre-made pfSense image, and have far more processing power than most appliances at a fraction of the cost. If you've got an old desktop sitting in the corner, it's probably just right for the job; for smaller jobs, a Raspberry Pi can be had for around $50. Just like commercial products, pfSense is scalable to an enterprise level, making it a cost-effective and viable option for SMBs and large enterprises alike.
And with features like Active Directory integration and add-ons like Snort for intrusion detection and real-time traffic monitoring, pfSense is again a serious contender against commercial products like Cisco's or Palo Alto's monitored firewall services.

All in all, the choice between a free VPN server and a commercial system will come down to your budget, your need for 24/7 phone support, and in some cases vendor-restriction requirements. If you're in an environment that only accepts commercial products, then feel free to propose an open-source alternative, but expect to be told "No." For those without such restrictions, leveraging OpenVPN, either on its own or integrated into pfSense, is worth serious consideration, no matter how small or large your environment is now or grows to be in the future. And with pfSense being maintained and updated by a for-profit company, even the free versions benefit from stable releases and timely patches that help keep your network safe as the years go by, while the user interface gains new features through version updates that streamline the management of your network.

Group Policy basics

Today we are reviewing Group Policy in Windows. One thing to keep in mind is that if you are on a computer connected to a domain, you need to be aware of how both Local and Domain settings are configured. Configuring settings in both can have unintended consequences or conflicts, so it's best to manage as much from the domain level as possible.

Since we are focused on business environments with domains, we will be working with Group Policy on a test server that already has Active Directory installed and configured. To open the Group Policy Manager, you can either select it from the Tools menu in the Server Manager or go to Administrative Tools in the Control Panel. For reference, if you need to access a system's Local Policy, open the Run command and type gpedit.msc.

The first thing you might have noticed is that a number of objects have been created in addition to the defaults. This is the best way to handle the organization of Group Policy. If you just edit the default template, you can end up forgetting where a particular setting was configured, and it also complicates applying policies to multiple objects in different ways. For example, if you want all domain admins to have specific drives mapped that ordinary users don't have access to, create separate objects for each and configure the settings accordingly; then you can assign the policies separately and very easily. If you have remote access configured in some manner, don't include it in the default or mapped-drive configurations. Create a separate object for that and apply it as needed. This way you can easily locate specific settings any time you need to manage your domain.

So, let’s look at a few settings that should ALWAYS be configured as follows.  First, let’s create a Group Policy Object and call it “Global-Security.”  Make sure to link it properly for your configuration, but in a simple environment the default setting is best. Now right-click the object and choose Edit to open the Group Policy editor.

Drill down to the User Rights Assignment section, and you will see a setting called "Allow log on locally." Double-click it to open the editor, check "Define these policy settings," add the built-in Administrators group, and click "OK." Next, select "Deny log on locally" and add the Guest account and the Guests group. This configuration adds a little extra security to your domain because you are explicitly denying the Guest account's logon permissions.

Next, choose Security Options on the left. Look for the "Accounts: Administrator account status" setting, double-click to open the editor, and set it to "Disabled." You should already have one or more other accounts designated with the necessary privileges to manage your domains and servers, even if they aren't the same account. You should also first verify that none of the running services use the Administrator account. Locking down this account is a standard requirement for any level of security. Next, do the same thing for the Guest account setting.

The last setting we're going to cover is UNC Hardened Access. Microsoft released bulletin MS15-011 in February 2015 with instructions for configuring this policy, including an explanation of the issue; you can Google the bulletin number if you're interested in learning more. If you have a complex environment, you should read the bulletin before proceeding so that you understand the settings involved, but if you only have one server with a handful of workstations, the following settings are the best choice.

Drill down on the left to the Network Provider section, and you will see the Hardened UNC Paths setting. Double-click it to open the editor and click Enabled. Scroll down until you can see the Show button, click it, and enter in the Value Name field:

\\your domain name\*

In the Value field you need to enter three settings, separated by commas, as follows:

RequireMutualAuthentication=1, RequireIntegrity=1, RequirePrivacy=1
Click OK twice to save the settings. 

The last thing we need to do is ensure that our settings won't be ignored in the event of a conflict with other policies. Close the Group Policy editor so that you are back at the Group Policy Manager. Right-click the object you just configured and choose Enforced. This will prevent any other policies you configure from overriding these settings during the boot and logon processes.

Thank you for watching!  I hope this helps you better understand how to interact with Group Policy. Now that you're familiar with the basics, be sure to look through all the settings available to you. Knowing how to use Group Policy properly will help you better manage your networks and keep them more secure.
Employee terminations and IT

Employees come, and employees go. Turnover is a natural part of the business life-cycle. But choosing to let an employee go isn't usually an easy decision, and the actual process of doing so is complicated. There is paperwork to be done, taxes to be calculated and paid, company property to be returned, and the company's security to be preserved. And with the onset of our digital world, there is a myriad of things that require the focus of your IT vendor(s) or staff as part of this process. This all adds up to a coordinated effort by multiple people in multiple departments just to execute one decision made by one person.

First and foremost, this means that the IT department needs to be notified when an employee is terminated. The situation will dictate when this notification is appropriate: sometimes it may be better to have the IT department on standby for further instructions, while other times notification won't be needed until after the employee has been informed. Either way, the IT department MUST be part of the process to help preserve and protect company property and infrastructure. When your IT department isn't included, you can end up with security holes in your infrastructure that could be exploited by a disgruntled employee, or you could lose valuable information if an employee deletes data from a computer or data store.

So, what does your IT department really need to be doing about an employee termination? First, all access the employee had to company resources should be locked down. This can be done by changing passwords or disabling user accounts, whichever is appropriate for the situation. Second, preserving information should be paramount. Computers may need to be imaged with a forensic tool such as FTK Imager (a free tool from AccessData), email accounts need to be backed up or archived in Exchange, and any cloud storage accounts need to be reviewed. That last point may require gaining access to the account(s), which can be easy or difficult depending on how the employee set things up. And backing up computers may mean getting them back from the employee, which isn't always easy if the situation is tense. Third, email accounts need to be handed over to someone else in the company who can take over where the former employee left off. This can be done by simply giving a current employee a way to access the email, or an automated response can be set up directing everyone to the new person. In some cases it may be best to delete the email account and create an alias under the new person, but only after backing up the old account first. And last, the company needs to take steps to delegate responsibilities properly. If this isn't done, sales could be lost or meetings could be missed, which leaves the company looking bad or worse. This may not require the IT department to facilitate, but sometimes technical help is needed to give those who pick up the responsibilities access to what they need from the old employee's files or accounts.
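The steps above lend themselves to a standard, documented checklist. A minimal sketch of such a template as data (the step names are illustrative, drawn from the steps above, not an exhaustive policy):

```python
# Illustrative offboarding checklist; every step should be signed off.
OFFBOARDING_STEPS = [
    "Disable user accounts / change shared passwords",
    "Image the computer with a forensic tool (if litigation is possible)",
    "Back up or archive the email account",
    "Review and secure cloud storage accounts",
    "Redirect or delegate the email address",
    "Delegate outstanding responsibilities",
]

def pending(completed: set) -> list:
    """Return the checklist steps not yet signed off."""
    return [step for step in OFFBOARDING_STEPS if step not in completed]

done = {OFFBOARDING_STEPS[0], OFFBOARDING_STEPS[2]}
print(len(pending(done)))   # 4 steps remaining
```

Even a simple list like this, kept with the company's written procedures, ensures no step is silently skipped when a termination is handled under pressure.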

This is starting to sound like an awful lot of effort, but some steps may not be necessary every time. For instance, forensic images take a long time to make. It's not something that requires a person's full attention the entire time, but if there is any question of illegal activity or a future lawsuit over proper compensation and wages, then preserving evidence will go a long way toward helping the company through the ordeal. This cannot be done after the fact, however; it can only be done at the time the employee first leaves. Once someone else is using that computer, the evidence is tainted and won't carry as much weight, if any, during litigation. But if there isn't any concern about future litigation or previous illegal activity, then a forensic image may not be necessary. And most companies these days do a decent job of limiting their employees' use of unapproved products and services, so things like random cloud accounts aren't usually a big issue.

Ultimately it's up to the person or people in charge to decide how to handle a given situation, but every company should have a standard for how to deal with employee terminations, and in the world we live in, this must include the IT department. Whatever procedures are decided upon should be well documented, and every manager or person with authority should know them and be able to reference a single standard template. Without good policies and procedures in place that include the IT department, a company could end up losing face in the eyes of its customers and partners, or leaving itself open to extensive damage.
© 2018 - Allora Consulting