IT Solutions for Small Business

Our goal is to make your company more profitable (however you measure profitability), by providing cost-effective IT solutions to many of your organization's fundamental challenges. Allora partners provide technical solutions for small to mid-sized businesses of virtually every type: from office environments, to manufacturing facilities, to retail storefronts. Our IT solutions are refined by many years of practical application in the real world; they are flexible, yet reliable.


The positive effects of computer networking on employee productivity, critical data retention, and overall IT budget savings are well established throughout industry. Allora builds reliable computer networks based on any combination of Windows, Macintosh, and Unix/Linux devices.


Email is also an indispensable business technology, both within an organization and as an economical, effective way to reach customers. A website, if nothing else, is cheap advertising, but a well-designed web application can qualitatively change your company's operations. Allora provides a full-featured set of hosting solutions for email, websites, and web apps.


Allora can also integrate your telephone system with your network or web presence, either to inform you about your callers, or simply to save money on long distance. We also provide cloud IT solutions via hosted Small Business Servers.

Articles

As part of our company philosophy, we encourage our customers to learn more about IT infrastructure. The following articles can save you time and money:

VPNs are becoming an integral part of operating almost every business in every industry in today's fast-paced telecommuter world.  Whether the need is to connect all your branches to the home office's resources or to allow your salesforce and project managers to access company resources remotely, a VPN is the way to keep everyone connected to your centralized resources.  So, what is a VPN?  How do you set one up?  How do you choose the right configuration?  What options are available, and why would one be better than another?  Read on to find out!

At its core, a VPN is a tunnel through the internet from point A to point B that shields the data being sent and received from public access and scrutiny.  There are two basic types: the Site-to-Site VPN and the Client/Server VPN.  A Site-to-Site VPN connects two or more separate physical locations, such as the branches of a bank or retail chain, to a company's main internal network at its headquarters.  This allows everyone in a branch, while in the office, to access company resources housed at headquarters as if they were physically there.  There are clear benefits to leveraging a VPN for this purpose.  Keeping data on centralized servers and granting access through folder shares over the VPN makes for a more streamlined security system that is easier to manage, and backups are guaranteed to include the most important company data, since that data lives on servers at the main office.  The drawback to a Site-to-Site VPN is that it only grants access to centralized company resources while the employees at the remote location are in the office.  Employees who travel constantly or work from home would not have access to the VPN-available resources.  A Client/Server VPN addresses the issue of traveling and remote individual workers.  This type of VPN lets individual users connect to centralized company resources no matter where they are; all they need is an internet connection and a laptop or mobile device configured for the VPN.
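To make the client side concrete, here is a minimal sketch of what an OpenVPN client profile for a Client/Server VPN might look like. This is illustrative only: vpn.example.com and the certificate file names are placeholders, and your actual settings must match your server's configuration.

```
# client.ovpn: illustrative sketch (hostname and file names are placeholders)
client
dev tun
proto udp
remote vpn.example.com 1194   # your VPN server's public address and port
resolv-retry infinite
nobind
persist-key
persist-tun
remote-cert-tls server        # refuse to connect to anything but a server certificate
ca ca.crt
cert client.crt
key client.key
verb 3
```

With a profile like this installed, the employee only needs an internet connection; the certificates do the heavy lifting of authentication.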

Whether a VPN is configured for Site-to-Site and/or Client/Server functionality, there are a number of protocols to choose from.  These include Internet Protocol Security (IPSec), Layer 2 Tunneling Protocol (L2TP), Point-to-Point Tunneling Protocol (PPTP), Secure Socket Tunneling Protocol (SSTP), and OpenVPN.  Right off the bat, PPTP has been proven easy to crack and should NOT be used if at all avoidable.  IPSec uses existing internet protocols to establish a secure connection at both ends of the tunnel and encrypt the traffic.  L2TP creates a tunnel but is usually combined with another protocol, such as IPSec, to secure the traffic passing over it.  SSTP establishes its tunnel over TCP port 443 and secures it with SSL/TLS.  It is a newer option for Windows platforms only, and is often preferred for its use of SSL/TLS certificates and the fact that port 443 is almost always open, so no additional ports need to be opened on a firewall to allow the VPN traffic through.  OpenVPN is an open-source design for establishing a free VPN server that uses SSL/TLS to secure the traffic.  This is done with certificates that are either generated by an in-house CA server or with OpenSSL, which is included in the OpenVPN install.  Additionally, passwords can be layered on top of the certificates for an extra measure of authentication and security.
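As an example of how these pieces fit together, here is a hedged sketch of an OpenVPN server configuration using certificates, assuming the CA certificate, server certificate and key, and Diffie-Hellman parameters have already been generated with OpenSSL or Easy-RSA. The subnets, file names, and script path are placeholders, not recommendations.

```
# server.conf: a minimal OpenVPN server sketch (names and subnets are placeholders)
port 1194
proto udp
dev tun
ca ca.crt                 # CA certificate used to verify client certificates
cert server.crt
key server.key
dh dh2048.pem
server 10.8.0.0 255.255.255.0            # tunnel subnet handed out to clients
push "route 192.168.1.0 255.255.255.0"   # let clients reach the office LAN
keepalive 10 120
persist-key
persist-tun
verb 3
# Optional password layer on top of certificates (script path is hypothetical):
# script-security 2
# auth-user-pass-verify /etc/openvpn/check-pass.sh via-file
```

The commented-out lines at the bottom show where the "passwords on top of certificates" idea plugs in; the verification script itself is up to you.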

There are many hardware and/or software options available these days, ranging from free VPN server software such as OpenVPN to elaborate hardware/software systems such as Cisco's various solutions.  So why choose one over the others, and what's involved in making that choice?  First, decide whether a Site-to-Site VPN, a Client/Server VPN, or both is what's needed.  Open-source and commercial solutions usually support both types, and both come with a learning curve, but look for one in each category that offers what your situation requires.  Being open source, OpenVPN is a free VPN server, but you're pretty much on your own if you need help setting it up.  Cisco requires investing in their hardware and software, as well as client licenses for most devices that will access the VPN.  The benefit of commercial solutions is the available tech support and the unified design, which can make implementation and management feel easier, even if it's not.

If you choose a commercial solution, then you are done with the initial decisions and should next start learning that particular platform in preparation for the design and implementation stages.  Choosing the open-source OpenVPN is not the end of the decision making.  OpenVPN is distributed as a stand-alone software package that will run both ends of a VPN and is compatible with most major platforms, making it a great choice for homogeneous and hybrid environments alike.  In addition, OpenVPN has been integrated into other software packages such as pfSense, Untangle, and IPFire, as well as hardware such as Netgate's pfSense appliances and Ubiquiti's EdgeMAX products.  We've not had any experience with Untangle or IPFire, but on paper both look similar to pfSense.  We found Ubiquiti's EdgeMAX products very difficult and slow to configure.  Additionally, appliances tend to have less power under the hood, which can get expensive for environments with high traffic volumes.  pfSense differs in that it offers an almost all-inclusive package for implementing and managing a network, including OpenVPN, and is much easier to set up than Ubiquiti's or Cisco's equipment.  Better still, you don't even have to buy Netgate's pfSense appliances: you can round up a desktop computer, apply the pre-made pfSense image, and have far more processing power than most appliances at a fraction of the cost.  If you've got an old desktop sitting in the corner, it's probably just right for the job; for smaller jobs, a Raspberry Pi can be had for around $50.  Like commercial products, pfSense is scalable to the enterprise level, making it a cost-effective and viable option for SMBs and large enterprises alike.  And with features like Active Directory integration and add-ons like Snort for intrusion detection and real-time traffic monitoring, pfSense is a serious contender against commercial products like Cisco's or Palo Alto's monitored firewall services.

All in all, the choice between a free VPN server and a commercial system will come down to your budget, your need for 24/7 phone support, and in some cases vendor-restriction requirements.  If you're in an environment that only accepts commercial products, then feel free to propose an open-source alternative, but expect to be told "no."  For those without such restrictions, leveraging OpenVPN on its own or integrated into pfSense is well worth serious consideration, no matter how small or large your environment is now or grows to be.  And with pfSense being maintained and updated by a for-profit company, even the free versions benefit from stable releases and timely patches that help keep your network safe as the years go by, while the user interface gains new features with each version that streamline the management of your network.
Employees come, and employees go.  Turnover is a natural part of the business life cycle.  But choosing to let an employee go isn't usually an easy decision, and the actual process of doing so is complicated.  There is paperwork to complete, taxes to calculate and pay, company property to recover, and company security to preserve.  And in our digital world, there are myriad tasks that require the attention of your IT vendor(s) or staff as part of this process.  It all adds up to a coordinated effort by multiple people in multiple departments just to execute one decision made by one person.

First and foremost, this means that the IT department needs to be notified when an employee is terminated.  The situation will dictate when notification is appropriate; sometimes it may be better to have the IT department on standby for further instructions, while other times notification won't be needed until after the employee has been informed.  Either way, the IT department MUST be part of the process to help preserve and protect company property and infrastructure.  When your IT department isn't included, you can end up with security holes in your infrastructure that a disgruntled employee could exploit, or you could lose valuable information if an employee deletes data from a computer or data store.

So, what does your IT department really need to do about an employee termination?  First, all access the employee had to company resources should be locked down.  This can be done by changing passwords or disabling user accounts, whichever is appropriate for the situation.  Second, preserving information should be paramount.  Computers may need to be imaged with a forensic tool such as FTK Imager (a free tool from AccessData), email accounts need to be backed up or archived in Exchange, and any cloud storage accounts need to be reviewed.  That last point may require gaining access to the account(s), which can be easy or difficult depending on how the employee set things up.  And backing up computers may mean getting them back from the employee, which isn't always easy if the situation is tense.  Third, email accounts need to be handed over to someone in the company who can pick up where the former employee left off.  This can be done simply by giving the current employee a way to access the email, or an automated response can be set up directing everyone to the new contact.  In some cases it may be best to delete the email account and create an alias under the new person, but only after backing up the old account first.  Last, the company needs to delegate the former employee's responsibilities properly.  If this isn't done, sales could be lost or meetings missed, which leaves the company looking bad or worse.  The IT department may not need to facilitate this, but sometimes technical help is needed to give those who pick up the responsibilities access to what they need from the old employee's files or accounts.

This is starting to sound like an awful lot of effort, but some steps may not be necessary every time.  For instance, forensic images take a long time to make.  It's not something that requires a person's full attention the entire time, but if there is any question of illegal activity or a future lawsuit over proper compensation and wages, then preserving evidence will go a long way toward helping the company through the ordeal.  This cannot be done after the fact; it can only be done at the time the employee first leaves.  Once someone else is using that computer, the evidence is tainted and won't carry as much weight, if any, during litigation.  But if there isn't any concern about future litigation or previous illegal activity, a forensic image may not be necessary.  And most companies these days do a decent job of limiting their employees' use of unapproved products and services, so things like random cloud accounts aren't usually a big issue.
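Where a full forensic image is overkill, even a simple hash manifest taken at the moment of departure helps document what was on a machine. The Python sketch below is our illustration of the idea, not a substitute for a proper forensic tool like FTK Imager; it records a SHA-256 digest for every file under a directory so later changes can at least be detected.

```python
import hashlib
import json
import pathlib

def hash_tree(root):
    """Return {relative_path: sha256_hex} for every file under root."""
    root = pathlib.Path(root)
    manifest = {}
    for path in sorted(root.rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            manifest[path.relative_to(root).as_posix()] = digest
    return manifest

def write_manifest(root, out_file):
    """Snapshot the tree's hashes as JSON so later changes can be detected."""
    pathlib.Path(out_file).write_text(json.dumps(hash_tree(root), indent=2))
```

Running `write_manifest` against a departing employee's home directory, and storing the JSON somewhere safe, gives you a cheap baseline; any later tampering will show up as a hash mismatch.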

Ultimately it's up to the person or people in charge to decide how to handle a given situation, but every company should have a standard for dealing with employee terminations, and in the world we live in this must include the IT department in the process.  Whatever procedures are decided upon should be well documented, and every manager or person with authority should know them and be able to reference a single standard template.  Without good policies and procedures in place that include the IT department, a company could end up losing face in the eyes of its customers and partners or leaving itself open to extensive damage.
Dealing with data storage has become one of the core functions of every employee in every company, worldwide. While everyone else is creating data, it's the job of IT departments and outside IT consultants to ensure that everyone in the company can access their data conveniently, to restrict access to sensitive data, and to protect file storage from loss and theft. Employees access and share data through developing and maturing technologies such as file storage synchronization (sync for short) and file sharing. These technologies are driving a change in workflow from an individualized structure to a more unified form across the organization, allowing for a streamlined way to share data and collaborate on projects.

So, what are file sync and file sharing, and how do they differ from data backups? File sync is the management of data across multiple devices and locations through replication and mirroring. Simply put, file sync copies new files to all other devices and updates all other devices when a file is edited or deleted. File sharing is the process of allowing others to access your data storage. Everyone has been doing this through email and physical media for decades, but more recently methods have been developed that allow online data storage with user-configured access controls to ensure proper security. Backups, in their simplest form, are duplicate copies of files stored elsewhere to ensure that data is never truly lost. While file sync and sharing are sometimes packaged together with online backup products and services, sync and sharing are more about keeping your data current and letting you collaborate with others more easily.
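At its simplest, one-way sync is just "make the destination look like the source." The Python sketch below illustrates that core idea; real sync services layer conflict handling, change detection, and network transport on top of it.

```python
import filecmp
import pathlib
import shutil

def sync_one_way(src, dst):
    """Mirror src into dst: copy new or changed files, remove files gone from src."""
    src, dst = pathlib.Path(src), pathlib.Path(dst)
    dst.mkdir(parents=True, exist_ok=True)
    src_files = {p.relative_to(src) for p in src.rglob("*") if p.is_file()}
    dst_files = {p.relative_to(dst) for p in dst.rglob("*") if p.is_file()}
    for rel in src_files:
        target = dst / rel
        # copy when the file is new, or its contents differ from the source
        if rel not in dst_files or not filecmp.cmp(src / rel, target, shallow=False):
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src / rel, target)   # copy2 preserves timestamps
    for rel in dst_files - src_files:
        (dst / rel).unlink()                  # deletions propagate too
```

Note that this mirrors deletions, which is exactly why sync is not a backup: delete a file at the source and every synced copy disappears with it.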

File sharing can further be broken down into internal and external recipients. In times gone by, internal file sharing was usually handled through emails between employees or shared storage on the company network. Each person would work on a file individually, and files could only be edited by one person at a time. This certainly worked, but it was usually a messy and confusing workflow. Either multiple versions of a file caused confusion among the team, or the work went slower than it could have, since each person had to make their edits and then everyone had to spend time compiling them into a finished product. Furthermore, because of all the different versions of a file, data backups were larger than they should have been, creating additional data management issues and expense. External sharing was usually handled via email or FTP (File Transfer Protocol) solutions. Email worked fine for small files, but large files required an FTP solution to be transferred across the internet. FTP solutions are an added and ongoing expense, whether in-house or through a service, and the FTP protocol has had its ups and downs regarding standardization and security. The same workflow issues that slowed productivity and complicated backups arose when sharing data externally, too.

File sync and online backup services have been established for some time and are now maturing into solid business solutions. The maturing cloud infrastructures behind them are seeing price drops and performance boosts as the tech industry moves forward. The major services currently include Dropbox, Google Drive, Microsoft OneDrive, and SugarSync. Additionally, ownCloud is establishing a foothold in the market for those who want or need custom solutions. Amazon's AWS platform does not offer file sync on its own, but because of its robust platform and development capabilities it is well worth consideration. The following table compares the key features to consider when searching for the best fit for your needs.

[Table: cloud storage feature comparison]

Dropbox has come a long way, but it has had some notable security issues over the years and has struggled to meet stringent industry compliance regulations. That said, Dropbox has certainly improved and is the only one of these services to offer LAN sync, which greatly improves performance inside your network. Google Drive has also found it difficult to meet compliance standards because of its terms of service. The fact that Google has been known to scan your data for advertising purposes, plus the question of data ownership, puts it low on the list for organizations with strict privacy requirements. Furthermore, for a company that offers extremely fast internet service, Google has been known to limit bandwidth and be slow in general to use. Microsoft's services were once quite terrible, but as they have matured they have come into their own. Decent connection speeds, good compliance standards, reasonable prices, and no up-front development requirements make OneDrive a good all-around choice for most. SugarSync is a bit pricey and lacks clear compliance certifications, but it is the only pre-made service to offer sync of files stored anywhere on your computer. ownCloud and Amazon are both great choices if you need completely customized solutions, but the up-front implementation effort makes these services worthwhile only if you are willing to find third-party software, or build your own, to get the features you want. Furthermore, there are so many variables in custom solutions that the cost and performance can be too much for many organizations to handle.

Ultimately, which cloud storage is best for you depends on your needs. And, there are many more online file sync services out there that may have the one feature no one else has. As with everything in IT, the key is narrowing down the options based on the core needs of your organization, then researching that short list for the details that can make the difference. Hopefully, this article has helped you feel confident that you can ask the right questions and make the right decision.
As everyone now knows, securing a network is important.  Companies of every size and consumers alike have options for doing so, but for a long time securing a network was only an option for those with significant resources to invest.  As technologies have progressed, decent security settings have become a standard feature of every router and computer, but for small businesses this can seem like the limit of their budget.  Setting up robust firewalls or using monitoring services still requires a significant investment, so most small businesses rely on the built-in firewalls and password options that come with their devices.  There are open source solutions like pfSense and commercial solutions like those offered by SonicWall and Zyxel.  So, which one should you choose?

First, it's important to understand what open source really means.  In a standard business model, companies make software and hardware, which costs money, with the end goal of making more money than they spent, i.e. profit.  That profit can end up in any number of places, but one that is important to the continued success of any company is maintaining the product through customer service, technical support, and ongoing maintenance and development.  This is the benefit of commercial solutions.  However, companies must protect their intellectual property, and so are not always willing to let others see some or all of the source code behind their software.  This can lead to slow development of features and patches, and with only so many developers working on a product, detection of bugs and security flaws can be lacking.  Open source software is freely distributed, donations are usually accepted to help maintain the project, and anyone who wants to volunteer can work on it.  The benefits of open source are the reverse of commercial solutions: faster updates and patching, at the expense of little or no unified tech support.  There are usually message boards devoted to a project like an open source firewall, but that is not direct, dedicated tech support.

pfSense sports a robust feature set and can be configured simultaneously for DNS, DHCP, routing, firewall, VPN, high availability, load balancing, traffic shaping, captive portal, UTM server, intrusion detection, intrusion prevention, proxy server, and web content filtering.  This means anyone can have enterprise-grade network security for only the cost of the hardware it runs on.  Second, the hardware requirements are quite low.  A pfSense server can be created from one of the old computers a small business usually has sitting in the closet.  A few low-cost upgrades might be in order, such as RAM or dedicated network cards, but otherwise that old computer is ready to go as-is.  Being an open source router, pfSense might seem like a great option except for the lack of tech support, but Netgate has closed that gap with their pfSense Gold package.  For $99 a year you get tech support, access to ongoing resources and training videos, an actual manual, and a backup service for your pfSense instance(s).  That's not a bad deal at all, and it's well worth it for novices and experts alike.

Digging into the nuts and bolts, the configuration options are extensive.  The pfSense firewall can be configured for granular access control, and the VPN offers IPSec or L2TP security and will even integrate with Windows Active Directory.  The intrusion detection and prevention offers standards like IP blacklisting and Snort-based packet analysis, and there is an emerging-threats database that can be enabled.  The only drawback to the IDS/IPS is that while these are free add-on packages, if you want the most current updates on the fly you will have to subscribe and pay for them.  The list of add-on packages for pfSense is lengthy as well.  Categories include security, network management, monitoring, services, system, routing, and miscellaneous.  Most of these are self-explanatory, but "services" refers to adding functions not necessarily related to networking, such as data backups or cron scheduling.  Miscellaneous packages are just that, and out of this category the Notes and Sarg packages are the most notable.
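The idea behind IP blacklisting is simple enough to show in a few lines. The Python sketch below is purely illustrative (pfSense implements this inside its packet filter, not in Python), and it uses reserved documentation networks as a stand-in for a real blacklist feed.

```python
import ipaddress

# Hypothetical blacklist; these are RFC 5737 documentation ranges, not real threats.
BLACKLIST = [ipaddress.ip_network(n) for n in ("198.51.100.0/24", "203.0.113.0/24")]

def is_blocked(addr):
    """Return True if addr falls inside any blacklisted network."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in BLACKLIST)
```

A production IDS/IPS does the same membership test at wire speed against feeds of thousands of networks, which is why the subscription-based update services matter.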

Since Netgate's acquisition of the pfSense project, they have also started designing and selling their own hardware appliances with pfSense preinstalled.  Instead of buying a computer and having to deal with hardware maintenance and upgrades, you can buy whichever model suits your company's needs and still get all the features pfSense offers.  Also included with the purchase of any model is a one-year membership to pfSense Gold.  With prices starting at $150 for a passthrough box, this is a great option if you are implementing a new network or segment.  Some may find the lack of control over the hardware a drawback, while others will find these appliances a cheaper and easier way to implement routing and security.

On the commercial side of firewalls (pun intended), SonicWall, Zyxel, and Cisco all offer reasonably priced solutions for small businesses.  The Cisco ASA line has long been a standard for VPN and firewall routing, and the others offer much the same in terms of features.  You get the same basic features in any of these products: firewall, VPN, routing (in some), and usually some basic logging functions.  The main issue is that while these products do a pretty good job of securing a network (if configured properly), they don't usually offer extensive configuration options or robust logging without spending a lot of money.  Prices for what a small business needs will range from $150 to $500, which isn't too bad.  Still, the Netgate products are simply a better value on price versus features, and from a tech perspective the more robust intrusion prevention and detection that pfSense offers is a must.  Most small businesses shouldn't, and won't, spend what it takes to get that from a commercial solution.

Having an open source option that has also been reasonably commercialized puts network security squarely in everyone's hands.  There is no longer any reason for a small business to cite high costs or a lack of options as an excuse for weak security.  And if you still feel that an open source firewall isn't the way to go, there are reasonably priced commercial solutions that will get the job done too.
In today's world the Skype service is considered a dinosaur rather than a benchmark of progress and quality. At the time of its release Skype was at the peak of technology and immediately attracted the attention of a huge number of users. Today, however, newer services such as Viber and WhatsApp have caught up with and overtaken Skype in number of users as well as in other criteria: ease of registration, contact search, and data protection.

A bit of history


To understand what Skype was for its time, we must remember the level of technology at the moment of its release. It is autumn 2003. Modern smartphones do not exist yet. PDAs (palm handhelds with Windows Mobile and limited capabilities) have just started gaining popularity. Laptop battery life and performance are less than modest; most people still use desktops. WiFi is also rather limited, and a 128 kbit/s leased line is considered a luxury unless you live in a capital city. Dial-up modems still run the show everywhere else. The whole world uses text messengers: ICQ and MSN Messenger. Facebook has not yet been invented, and the concept of social networking is still in the bud. Computer services for PC modernization are starting to appear.

At this point a voice communication service arrives, free for everyone. All you need is to download and install a client program and go through a pretty simple registration process. Everyone installs it, they tell each other their Skype names, test it and... it's a miracle! It simply works!

Skype's revolution, and its fundamental difference, was something that later became its Achilles' heel: the architecture. The company based the service on a decentralized P2P architecture, in which each user's computer running the client program and having a public ("white") IP address was converted into a router for nearby Skype users behind NAT. This made it possible to scale the network up to an unimaginable size of tens of millions of users without expensive equipment or quality loss. BitTorrent users, whose system is based on the same P2P principle, know well that the more people are involved in downloading (and simultaneously sharing) the same file, the higher the overall speed of access to that file. Each owner of even a "piece" of the file immediately shares it with the nearest neighbor while receiving another piece from another neighbor.

What will it be?


A few days ago there was a post on the company's official blog, "Skype – the journey we've been on", which describes a new strategy and a new platform for the further development of the service.

The blog states that all recent development has been focused on the complete rejection of the P2P architecture and a transition to modern cloud technology. Among the advantages of the new approach are pre-existing features such as file transfer and video messaging, which have undergone substantial modernization, as well as new products such as group video, a translator, and the bots that are so popular these days. Besides, abandoning the old architecture should allow Microsoft to rid Skype of a number of issues on mobile devices, such as message transfer rates, synchronization between devices, and push notifications.

After the transition, Skype promises to focus on call quality and some new features, the essence of which has not been disclosed. For users of systems left behind, they promise support and development of an updated web version, which will of course be cloud-based.

However


Skype, Viber, WhatsApp, iMessage, FaceTime, Telegram, Allo, Duo, Hangouts, Mail.Ru Agent, WeChat, QQ... Are there too many messengers? Too many proprietary protocols? After all, the basic functionality of all these services and applications is duplicated. Instead of a universal and open protocol, whose security the world community could improve, and where third-party developers would be responsible only for the convenience and functionality of client applications, we got this wild zoo of protocols and standards, which forces us to keep multiple instant messengers running on our smartphones at the same time, at the expense of convenience and comfort. Time will tell.
© 2018 - Allora Consulting