For Bisnow partner VTS, its viability as a platform depends on its ability to keep users' information secure. As commercial real estate's biggest leasing platform, VTS has not only the incentive, but the resources to build one of the most comprehensive and innovative cybersecurity systems in the industry. In fact, the company just received a stellar SOC 2 report, which gauges a service provider's security level.
According to VTS's director of information security, Robert Lowry, this report, handled by an independent firm, not only assures customers that the platform's systems and processes are in place to protect their data and are working as intended, but also shows VTS's confidence in its security team.
Robert has spent his entire career in information security and information assurance, with stints at Nasdaq, the Federal Reserve Bank of New York and the Department of Defense. We picked his brain to create a list of practices VTS uses to help protect its clients' data.
To learn more about our Bisnow partner, click here.
1. Audits and Transparency
Robert says one of the most important aspects of the SOC 2 process is that it's an independent, third-party audit, which can give customers peace of mind and give VTS itself validation that it's operating correctly.
He says one of the first things the SOC 2 auditors check is the level of permissions given to users. These layers of permission mitigate the risk of employees and clients (the two biggest threats to a company's cybersecurity) accidentally exposing sensitive information or granting access to attackers.
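Layered permissions boil down to each role being granted only the actions it explicitly needs. The sketch below is a hypothetical illustration of that idea; the role names, actions, and `is_allowed` helper are assumptions for demonstration, not VTS's actual permission model.

```python
# Hypothetical role-based permission layers; roles and actions are
# illustrative only, not VTS's real configuration.
ROLE_PERMISSIONS = {
    "admin":  {"view_deals", "edit_deals", "manage_users"},
    "broker": {"view_deals", "edit_deals"},
    "client": {"view_deals"},
}

def is_allowed(role: str, action: str) -> bool:
    """Return True only if the role explicitly grants the action."""
    return action in ROLE_PERMISSIONS.get(role, set())

# A client can view deals but cannot manage users.
print(is_allowed("client", "view_deals"))   # True
print(is_allowed("client", "manage_users")) # False
```

Because every permission must be granted explicitly, an unknown role or a typo in an action name defaults to "denied," which is the fail-safe behavior an auditor looks for.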
2. Online Portals Protect Property
Robert says VTS's customers can enter data into the platform on their own, or the company's customer success teams can assist with initial onboarding, including by integrating with an accounting system such as Yardi or MRI.
During this initial integration process, VTS warns customers not to email the information, but rather use an online portal to securely send the data for manual entry.
"For customers who choose to integrate an accounting system, they upload only the needed information to a secure FTP server, which we then process and import the data into VTS through our robust and secure integration system," he says.
3. Static Code Analyses On Every Code Branch
Static code analysis simply means analyzing your source code without executing it to find potential security threats. It scans all the code, so if there are any vulnerabilities in the nooks and crannies of your application—even unused ones—a static code analysis will find them.
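To make the idea concrete, here is a minimal sketch of static analysis using Python's standard `ast` module: the source is parsed, never executed, and the tree is walked to flag risky calls. Real static analyzers apply hundreds of rules; this toy version checks only for `eval` and `exec`, and the sample code is invented for illustration.

```python
import ast

# Calls considered risky in this toy analyzer (real tools check far more).
RISKY_CALLS = {"eval", "exec"}

def find_risky_calls(source: str) -> list:
    """Parse source without executing it; return line numbers of risky calls."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id in RISKY_CALLS):
            findings.append(node.lineno)
    return findings

# The risky call is found even though this snippet is never run,
# and even if the function containing it were dead code.
sample = "x = 1\neval(user_input)\n"
print(find_risky_calls(sample))  # [2]
```

This is why static analysis catches vulnerabilities in unused corners of an application: the scan covers every parsed statement, not just the code paths that execute at runtime.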
4. Monthly Web Vulnerability Scans
A web vulnerability scanner communicates with a web application like VTS through the front end in order to locate possible vulnerabilities and architectural weaknesses. Even though they don't have access to the source code itself, these scanners identify risk areas by actually performing attacks. Hackers use the same tools, so if the tools can find a weakness, so can the intruders.
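A scanner of this kind works black-box: it submits a marker payload through the front end and checks whether the response hands it back unescaped. The sketch below simulates that probe against two stand-in page-render functions; the payload, function names, and pages are all invented for illustration and are not how any particular scanner is implemented.

```python
import html

# Marker payload a black-box scanner might inject into a form or URL.
PAYLOAD = "<script>probe()</script>"

def is_reflecting_unescaped(render) -> bool:
    """Submit the payload and check whether it comes back unescaped."""
    return PAYLOAD in render(PAYLOAD)

def unsafe_page(query: str) -> str:
    return f"<p>Results for {query}</p>"                # reflects raw input

def safe_page(query: str) -> str:
    return f"<p>Results for {html.escape(query)}</p>"   # escapes input

print(is_reflecting_unescaped(unsafe_page))  # True  -> flagged as vulnerable
print(is_reflecting_unescaped(safe_page))    # False -> passes the probe
```

The scanner never sees the source of either page; it infers the weakness purely from how the application responds, which is exactly why an attacker running the same tool would find the same flaw.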
5. Mandatory Code Reviews For Every Pull Request
A code review is a comprehensive examination of computer source code. It often identifies and removes the common vulnerabilities that can arise when someone submits a new addition to the source code, known as a pull request.
6. Annual Third-Party Penetration Tests
Much like the aforementioned scans, third-party penetration tests exist to make sure the VTS platform's technical implementation is secure and robust, with testers manually looking for paths by which a hacker might gain access to a computer or network server. VTS commissions these tests on a yearly basis.
The traditional patch is no longer the most modern or effective updating approach, Robert says.
"Every time VTS updates its code a new server's deployed with the latest operating system and application codebase," he says. "The old server's destroyed once there are no active connections."
Any old, unused servers are potential access points for hackers, so destroying them upon their obsolescence greatly reduces the risk of an intrusion.
At VTS, this process happens automatically at least every 24 hours, so its systems are always on the latest OS. When the Linux glibc buffer overflow was announced, Robert says, this process was instrumental in quickly cycling servers without any customer downtime.
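The cycle Robert describes can be sketched as a simple rotation: deploy a fresh server on the latest OS, shift traffic to it, and destroy each old server once its connections have drained. The in-memory model below is an illustration of that pattern under assumed names (`Server`, `rotate`); it is not VTS's actual deployment code.

```python
from dataclasses import dataclass, field
import itertools

_ids = itertools.count(1)

@dataclass
class Server:
    os_version: str
    server_id: int = field(default_factory=lambda: next(_ids))
    active_connections: int = 0

def rotate(fleet: list, latest_os: str) -> list:
    """Replace each server with a fresh one; keep old ones only until drained."""
    new_fleet = []
    for old in fleet:
        new_fleet.append(Server(os_version=latest_os))  # deploy fresh server
        if old.active_connections > 0:
            new_fleet.append(old)  # keep serving until connections close
        # otherwise the old server is destroyed (simply dropped here)
    return new_fleet

fleet = [Server(os_version="2023.1"), Server(os_version="2023.1")]
fleet = rotate(fleet, latest_os="2024.2")
print(all(s.os_version == "2024.2" for s in fleet))  # True
```

Because no server is ever patched in place, a fix like the glibc update ships by simply running the next rotation, and stale machines never linger as attack surface.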