BrowserLeaks
IP Address
JavaScript
WebRTC Leak Test
Canvas Fingerprint
WebGL Report
Font Fingerprinting
SSL Client Test
Geolocation API
Features Detection
Content Filters
Java Applet
Flash Player
Silverlight
It was long believed that IP addresses and cookies were the only reliable digital fingerprints used to track people online. That changed when modern web technologies gave interested organizations new ways to identify and track users without their knowledge, and with no easy way to avoid it.
BrowserLeaks is all about browsing privacy and web browser fingerprinting. Here you will find a gallery of web-technology security testing tools that show what kinds of personal identity data can leak, and how to protect yourself.
Setting up automatic DNS blocklist updates on Windows 10
Windows Firewall Control - Managing Windows Firewall is now easier than ever
URLhaus
URLhaus is a project from abuse.ch with the goal of sharing malicious URLs that are being used for malware distribution.
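Besides the web interface, URLhaus publishes downloadable feeds that are easy to wire into a blocklist pipeline. A quick look at the plain-text URL dump (the feed location below matches the project's documented downloads page at the time of writing; verify it before relying on it):
curl -s https://urlhaus.abuse.ch/downloads/text/ | head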
AppArmor on Debian (wiki page Distro_debian)
Stock Debian
AppArmor should be available out of the box in the latest Debian distros. Please see http://wiki.debian.org/AppArmor
To enable AppArmor in the Debian kernel, add "security=apparmor" to the kernel parameters, like this:
sed -i -e 's/GRUB_CMDLINE_LINUX_DEFAULT="/&security=apparmor /' /etc/default/grub
This sed command results in the following /etc/default/grub line on my system:
GRUB_CMDLINE_LINUX_DEFAULT="security=apparmor quiet"
Then run
update-grub
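After the next reboot, the parameter should show up in the kernel command line:
cat /proc/cmdline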
Experimental AppArmor on Debian Jessie amd64
Kernel
Obtaining
mkdir -p ~/apparmor/ && cd ~/apparmor/
wget https://www.kernel.org/pub/linux/kernel/v3.x/linux-3.10.2.tar.xz
tar -xJf linux-3.10.2.tar.xz
cd linux-3.10.2/
Building
cd ~/apparmor/linux-3.10.2/
See if we can reuse the existing kernel configuration (CONFIG_IKCONFIG=y, CONFIG_IKCONFIG_PROC=y):
cp /proc/config.gz ./ && gzip -d config.gz
Tweak the kernel configuration and enable AppArmor:
apt-get install libncurses-dev
make menuconfig
"Security options" ---> "AppArmor support", "Enable AppArmor 2.4 compatibility"
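Before building, it is worth confirming that the options actually landed in the configuration (assuming the usual .config at the top of the source tree):
grep -E 'CONFIG_SECURITY_APPARMOR|CONFIG_DEFAULT_SECURITY' .config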
Installing
aptitude install dpkg-dev bc
cd ~/apparmor/linux-3.10.2/
make deb-pkg
dpkg -i ../linux-firmware-image_{version}.deb
dpkg -i ../linux-headers-{version}.deb
dpkg -i ../linux-image-{version}.deb
If the kernel is installed on another host, the symlinks for DKMS need to be fixed:
rm /lib/modules/{version}/build; ln -s /usr/src/linux-headers-{version} /lib/modules/{version}/build
rm /lib/modules/{version}/source; ln -s /usr/src/linux-headers-{version} /lib/modules/{version}/source
A note on dpkg -i ../linux-libc-{version}.deb: /usr/include/x86_64-linux-gnu/asm seems to be missing from the latest linux-libc-{version}.deb. If you installed it, you can downgrade to the Debian version with aptitude install linux-libc-dev=3.0.0-3.
Finally:
update-grub
Checking
Reboot under new kernel:
/sbin/shutdown -r now
or
reboot
Now see if AppArmor is loaded and enabled (should print "Y"):
cat /sys/module/apparmor/parameters/enabled
Tools
aptitude install apparmor apparmor-profiles
/etc/init.d/apparmor restart
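With the userspace tools in place, aa-status (shipped with the AppArmor packages installed above) prints a summary of the loaded profiles and whether each runs in enforce or complain mode:
aa-status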
Checking
grep apparmor_parser /var/log/audit/audit.log
should display something like
type=AVC msg=audit(1316949034.097:108): apparmor="STATUS" operation="profile_load" name="/bin/ping" pid=5207 comm="apparmor_parser"
Tuning logs
By default, audit data lands in /var/log/messages via rsyslogd, where the kernel rate-limits it severely so it does not flood the messages log. To make audit data usable with AppArmor, install auditd and tune it to keep a larger volume of data:
apt-get install auditd
sed -i -re 's/max_log_file = [0-9]+/max_log_file = 200/' /etc/audit/auditd.conf
/etc/init.d/auditd restart
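To watch AppArmor events as they arrive, a simple follow on the audit log works:
tail -f /var/log/audit/audit.log | grep -i apparmor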
Securing OpenSSH
21 Aug 2020
sshd is the OpenSSH server process. It listens for incoming connections using the SSH protocol and acts as the server side of the protocol. It handles user authentication, encryption, terminal sessions, file transfers, and tunneling.
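A minimal hardening sketch for /etc/ssh/sshd_config, as a starting point rather than a complete policy (every value here is illustrative; make sure key-based login works before disabling passwords):
# No direct root logins
PermitRootLogin no
# Public keys only; verify key access first
PasswordAuthentication no
PubkeyAuthentication yes
# Slow down brute-force attempts
MaxAuthTries 3
# Disable forwarding you do not use
X11Forwarding no
Validate the file with sshd -t, then reload the service (the unit is named ssh or sshd depending on the distribution).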
About
This free tool audits the configuration of an SSH server or client and highlights the areas needing improvement.
Too many admins overlook SSH configuration when setting up new systems. Unfortunately, the defaults for many operating systems are optimized for compatibility, not security.
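For a local, scriptable variant of the same kind of check, the open-source ssh-audit tool covers similar ground (the hostname below is a placeholder):
ssh-audit example.com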
Multiple vulnerabilities found in Wireless IP Camera (P2P) WIFICAM cameras and vulnerabilities in custom http server
TL;DR: by analysing the security of a camera, I found a pre-auth RCE as root against 1,250 camera models. Shodan lists 185,000 vulnerable cameras. The "Cloud" protocol establishes clear-text UDP tunnels (in order to bypass NAT and firewalls) between an attacker and cameras, using only the serial number of the targeted camera. The attacker can then automatically brute-force the cameras' credentials.
Product Description
The Wireless IP Camera (P2P) WIFICAM is a Chinese web camera that allows remote streaming.
shhgit finds secrets and sensitive files across GitHub (including Gists), GitLab and BitBucket committed in near real time.
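shhgit watches public commit streams; as a crude local stand-in for a single repository (explicitly not shhgit itself), a pattern sweep over the full history already catches the obvious leaks, false positives included (patterns are illustrative):
git log -p | grep -Ei 'BEGIN (RSA|OPENSSH) PRIVATE KEY|aws_secret_access_key|api[_-]?key'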
DNS leak test
What is a DNS leak?
What are transparent DNS proxies?
How to fix a DNS leak
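Before fixing a leak you have to see it. One quick manual check of which resolver actually performs your lookups, assuming dig is available (whoami.akamai.net is a well-known probe that answers with the address of the resolver querying it):
dig +short whoami.akamai.net
If the returned address belongs to your ISP rather than the resolver you expect (your VPN's, for instance), your DNS queries are leaking.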
We collect and share information about the different bots (user-agents) you may see visiting your site. If you notice a bot you are not familiar with, search our database of bots. We list many bots that have been reported as bad bots and provide as much information as we can about each one.
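A short way to surface unfamiliar user-agents from your own logs, assuming nginx's stock combined log format and default path:
awk -F'"' '{print $6}' /var/log/nginx/access.log | sort | uniq -c | sort -rn | head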
The Mozilla Observatory has helped over 170,000 websites by teaching developers, system administrators, and security professionals how to configure their sites safely and securely.
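The headers the Observatory grades take only a few lines to set. An illustrative nginx snippet (the CSP here is deliberately minimal and needs tightening for a real site):
add_header Content-Security-Policy "default-src 'self'" always;
add_header Strict-Transport-Security "max-age=63072000; includeSubDomains" always;
add_header X-Content-Type-Options "nosniff" always;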
Exploits of the week
Online Student Enrollment System 1.0 - Cross-Site Request Forgery (Add Student)
Code Blocks 20.03 - Denial Of Service (PoC)
WebPort 1.19.1 - 'setup' Reflected Cross-Site Scripting
WebPort 1.19.1 - Cross Site Scripting
Frigate 2.02 - Denial Of Service (PoC)
Responsive Online Blog 1.0 - 'id' SQL Injection
Trend Micro Web Security - Remote Code Execution
Lansweeper 7.2 Default Account / Remote Code Execution
Student Enrollment 1.0 - Remote Code Execution
FileRun CVE-2019-12905 - Cross Site Scripting
About us
Report URI was founded to take the pain out of monitoring security policies like CSP and other modern security features. When you can easily monitor what's happening on your site in real time, you react faster and more efficiently, allowing you to rectify issues without your users ever having to tell you.
Our platform is constantly evolving to help you, our users, better protect your users.
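In practice that usually means shipping a policy in report-only mode first, so violations are collected without breaking the page (the reporting endpoint below is a placeholder for your own):
Content-Security-Policy-Report-Only: default-src 'self'; report-uri https://example.report-uri.com/r/d/csp/reportOnly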
Robots.txt
Introduction to the Robots Exclusion Protocol
The Robots Exclusion Protocol, better known as robots.txt, is a convention intended to keep web crawlers from accessing all or part of a website.
The robots.txt file, placed at the root of a website, contains a list of the site's resources that are not meant to be crawled by search engines. The file can also point the engines to the site's sitemap.xml file.
By convention, robots read the robots.txt file before crawling and then indexing a website. When a robot wants to access a web page, for example http://www.mon-domaine.fr/page.html, it first tries to fetch the robots.txt file located at http://www.mon-domaine.fr/robots.txt
How to create the robots.txt file
The robots.txt file must be placed at the root of your site; if it sits in a subdirectory, for example, search engines will not find it and will not follow its rules. The file must also be smaller than 62 KB (see "Maximum size of the robots.txt file").
If your site's domain is http://www.mon-domaine.fr/, the robots.txt file must be located at http://www.mon-domaine.fr/robots.txt
Note that the file name must also be lowercase (no Robots.txt or ROBOTS.TXT).
See also the resources on creating a robots.txt file on an HTTPS server or for subdomains.
The contents of the robots.txt file
Example contents of a robots.txt file (an empty Disallow permits crawling of the entire site):
User-agent: *
Disallow:
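A more typical file blocks specific paths and advertises the sitemap mentioned earlier (paths and domain are illustrative):
User-agent: *
Disallow: /admin/
Disallow: /tmp/
Sitemap: http://www.mon-domaine.fr/sitemap.xml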