TZ Security provides a wide range of web security services:
- malware detection and cleaning of compromised sites and servers (custom scripts and scanners + manual checks)
- daily monitoring, file integrity checking and overall protection of clients' sites & servers (custom script)
- proactive security and hardening against further intrusions and hacking attacks (php.ini, .htaccess, firewall script)
- website optimization for security, speed and overall performance (.htaccess)
- active research and daily updates on new malware variants and heuristic patterns to prevent unknown attacks
- deobfuscating, decoding and analyzing encrypted malicious code and scripts, and turning the findings into new detection patterns
- server & web application vulnerability research and security audits on request (web servers & web applications only)
We work on any type of site built on PHP / HTML: anything from custom PHP sites and popular Content Management Systems to eCommerce sites, forums and web frameworks. To mention a few: WordPress, Joomla, Drupal, PrestaShop, OpenCart, osCommerce, Magento, vBulletin, MyBB, MediaWiki, phpBB.
Our standard malware removal procedure
To start the process, we need login access for cPanel, FTP, the database and the website admin panel. Access to the website is blocked for everyone until we finish the cleaning process. A custom “work in progress” page is set up so visitors know the site will be back in a few hours.
step 1 – full website and database backup, capturing the state before any malware removal. This protects the client's files in case the hosting company decides to remove the website in order to protect other sites on the shared server, or the hacker decides at some point to erase everything. It also protects us if an issue arises (for example, if a client claims we broke the site, or the payment for our services is reversed, we can restore the backup made before cleaning).
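As an illustration, the file-backup part of step 1 can be sketched in a few lines. This is a minimal sketch, not TZ Security's actual tooling; function and path names are assumptions, and the database would be dumped separately (for example with mysqldump).

```python
import datetime
import tarfile
from pathlib import Path

def backup_site(docroot: str, dest_dir: str) -> Path:
    """Archive the whole document root before any file is touched.

    Illustrative sketch: a real backup would also include a database dump.
    """
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = Path(dest_dir) / f"site-backup-{stamp}.tar.gz"
    with tarfile.open(dest, "w:gz") as tar:
        # tar.add() is recursive, so every file under docroot is included
        tar.add(docroot, arcname="docroot")
    return dest
```

The timestamp in the archive name keeps multiple pre-cleaning snapshots from overwriting each other.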
step 2 – Server Scanning – we use our custom-coded scanner with an extensive pattern database.
The standard scanning process includes searching for:
– an extensive list of known file names of web shells, DDoS scripts and other hacker tools (including names found in the wild)
– malware patterns (our private database of patterns and specific function names taken from over 12,000 different web shells and their variants, uploaders, mass spam mailers, bruteforcers, exploit pack panels & files, SQLi/XSS/RFI/LFI exploit scanners, ransomware panels, booters, DDoS shells, and other types of malicious files and scripts)
– phishing pages, phish-pack files uploaded by phishers, defacement notices and pages
– functions and commands used in PHP shells (eval, assert, passthru, shell_exec, system, popen, exec, backticks, etc.)
– obfuscated / encoded patterns in PHP, JS, ASP and JSP files (base64, str_rot13, gzinflate, byterun, hex, uu, unescaping, etc.)
– a regex pattern scanner with known and general (heuristic) signatures
– hacker nicknames and hacker group names found in thousands of shells, backdoors, defacement pages, phish-packs, etc.
– spam keywords in various niches (database scanning for spam is included): pharma, weight loss, payday loans, insurance, software, dating, SEO services, etc.
– specific patterns for Windows servers: ASP / ASPX / CF malware strings
Scanning is done very thoroughly. Many of these patterns and functions can appear in legitimate scripts, so a lot of false positives are generated, but nothing gets missed, and that is our main goal.
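The scanning stage can be sketched as a recursive walk that matches file names and byte patterns. The signatures below are a tiny, generic subset chosen for illustration, not our actual database, and the function name is an assumption:

```python
import re
from pathlib import Path

# Illustrative subset of signatures (a real database is far larger)
SUSPICIOUS = [
    re.compile(rb"eval\s*\(\s*base64_decode"),         # classic obfuscated eval
    re.compile(rb"gzinflate\s*\(\s*base64_decode"),    # compressed payloads
    re.compile(rb"(passthru|shell_exec|popen)\s*\("),  # command execution
]
KNOWN_BAD_NAMES = {"c99.php", "r57.php", "wso.php"}    # well-known web shell names

def scan(root: str):
    """Return (path, reason) pairs for files matching a bad name or pattern."""
    hits = []
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        if path.name.lower() in KNOWN_BAD_NAMES:
            hits.append((path, "known file name"))
            continue
        data = path.read_bytes()
        for pat in SUSPICIOUS:
            if pat.search(data):
                hits.append((path, pat.pattern.decode()))
                break
    return hits
```

Because these functions also appear in legitimate code, every hit from a scan like this still needs the manual review described in step 3.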
step 3 – Checking the files found in step 2. This is 100% manual work, file by file: a slower but safer process that leaves no room for mistakes. Some “services” boast about “fast automatic cleaning” as a major selling point. Can you imagine automated cleaning that is supposed to recognize malware injected into regular files and clean them correctly, across so many different platforms and files using the same functions? We can't.
step 4 – 7-day server traffic monitoring – after the initial cleaning, we monitor the client's website for 7 days, observing traffic requests, visited pages, files and paths that are not normally supposed to be visited (plugin files, LFI/RFI paths, direct .php file requests), various traffic anomalies, visitor geolocations and referrers, bots, crawlers and scanners. We also apply custom server hardening (php.ini and .htaccess) and optimize the website for speed and general performance.
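The traffic-monitoring idea in step 4 amounts to flagging access-log lines for paths that normal visitors never request. A minimal sketch, assuming a common log format and illustrative patterns (not our actual rules):

```python
import re

# Requests that regular visitors should never make (illustrative subset)
ANOMALY_PATTERNS = [
    re.compile(r"\.\./"),                   # directory traversal attempts
    re.compile(r"=(https?|ftp)://", re.I),  # remote file inclusion attempts
    re.compile(r"/wp-content/.+\.php"),     # direct .php requests to plugin dirs
]

def flag_requests(log_lines):
    """Yield access-log lines whose request matches an anomaly pattern."""
    for line in log_lines:
        if any(p.search(line) for p in ANOMALY_PATTERNS):
            yield line
```

Running this daily over the access log surfaces probe traffic worth a closer look, even when the requests themselves returned 404.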
After cleaning, we will send you a report covering what we found, the most likely scenario of how the intrusion happened in the first place, and what we did about it.
Server and PHP hardening
Hardening is part of both the one-time cleaning and the full TZ Security cleaning & protection service. By disabling unnecessary functions, adding security headers and blocking specific URL queries, we can protect clients' web applications from many generic hacking attempts, SQL injections, directory traversals, etc.
Note: full server hardening includes steps that are not available to clients on shared servers; we cannot perform them without root access. What we can do is custom php.ini and .htaccess hardening of websites and applications. This is a basic but very powerful layer of protection.
– Custom php.ini – blocks and disables dangerous functions and options that should not be active on your server. Everything is checked properly so nothing your application(s) rely on gets blocked.
– Custom .htaccess – carefully set rules for security, server tightening, filtering out various bots and user agents, blocking specific requests characteristic of RFI/LFI/SQLi/XSS, hotlinking prevention, etc. Additionally, we take care of proper speed and performance optimization of the client's site, with rules for expires headers, ETags, gzip/deflate, etc. The client's sites will load faster and use fewer server resources.
– File permissions (CHMOD) – we check and set correct permissions on all files and folders.
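To give an idea of what such .htaccess hardening looks like, here is a minimal, generic fragment; the rules shown are a small illustrative subset, not the actual rule set, which is tuned per site and application:

```apache
# Illustrative subset of hardening rules (real rule sets are tuned per site)
<IfModule mod_rewrite.c>
  RewriteEngine On
  # Block query strings typical of traversal / injection attempts
  RewriteCond %{QUERY_STRING} (\.\./|etc/passwd|base64_encode) [NC,OR]
  RewriteCond %{QUERY_STRING} union.*select [NC]
  RewriteRule .* - [F]
</IfModule>

# Performance: compress text responses and cache static assets
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
</IfModule>
```

The `<IfModule>` guards keep the site working even if a given Apache module is not loaded on the host.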
URL Filtering (PHP Firewall)
An excellent layer of security: a regularly updated standalone PHP script (it works with any PHP website, CMS or framework) that checks all GET, POST, REQUEST and COOKIE values for specific patterns and blocks access to the site before any damage is done. We check the type of application in use and adjust or remove any conflicting rules so everything on the site works properly.
It protects against common (and uncommon) XSS / LFI / RFI / SQL injections, directory traversal attempts, scrapers, scanners, bots, crawlers and other potentially harmful & resource-hogging requests. We actively research and update the rules daily with new malware variants and heuristic patterns to prevent unknown attacks. Additional filtering by IP range, user agent and host is available.
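The core of such URL filtering is just matching every request value against a blocklist of attack signatures before the application sees it. A minimal sketch (the real script is PHP and far more extensive; these patterns and names are illustrative assumptions):

```python
import re

# Illustrative subset of attack signatures checked against every request value
BLOCK_PATTERNS = [
    re.compile(r"<script", re.I),           # reflected XSS attempts
    re.compile(r"union\s+select", re.I),    # SQL injection
    re.compile(r"\.\./"),                   # directory traversal / LFI
    re.compile(r"^(https?|ftp)://", re.I),  # remote file inclusion (RFI)
]

def is_blocked(params: dict) -> bool:
    """Return True if any GET/POST/COOKIE value matches a blocked pattern."""
    for value in params.values():
        for pat in BLOCK_PATTERNS:
            if pat.search(str(value)):
                return True
    return False
```

A firewall like this runs before the application bootstraps, so a matching request can be rejected before any database or file access happens.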
Monitoring for file changes
Daily check for any changes to server & website files. All files are scanned recursively; hash values are calculated and compared with the previous clean state every day. If something is added, removed or changed on the server / website, we will know, and will check and clean it if necessary. Maximum incident response time is 8 hours (usually much faster).
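The hash-and-compare step described above can be sketched as follows. This is a simplified illustration (function names are assumptions), but the principle of snapshotting hashes and diffing against the last clean state is the same:

```python
import hashlib
from pathlib import Path

def snapshot(root: str) -> dict:
    """Map every file under root to its SHA-256 hash."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in Path(root).rglob("*") if p.is_file()
    }

def diff(old: dict, new: dict):
    """Return files added, removed, or changed since the previous snapshot."""
    added = sorted(set(new) - set(old))
    removed = sorted(set(old) - set(new))
    changed = sorted(f for f in set(old) & set(new) if old[f] != new[f])
    return added, removed, changed
```

Storing only the hash manifest keeps the daily comparison cheap even for sites with many thousands of files.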
Backup of client’s data
Manual backup of all files and the database, downloaded to our storage units. We keep the last 4 backups of each web application and its corresponding database.
Backups are handled off-site, in a secured environment; we use standard 4 TB hard-disk units (the same type Facebook uses), physically disconnected from any internet / intranet / local network access.
* Note: backup is an optional service, invoiced separately if needed. Cloud backups are a cheaper option, but if a client prefers offline backups, we can provide them.
The price for download + storage is US$5 per month for 1 backup (application + database), or US$20 per month for 4 backups. Prices apply to up to 1 GB of data per backup.