
About Sociosploit

Stated simply, most people do not understand technology, and that gap in understanding can often be exploited through carefully tailored social engineering attacks. Sociosploit is an academic research blog that examines security risks and exploitation opportunities on the social web, with a focus on the unique social dynamics at the intersection of technology and social psychology. Topics we address include:
  • Information Security
  • Privacy
  • Misuse of Social Network Platforms
  • Social Engineering
  • Other miscellaneous hacking stuff...

$~ Whoami

Hutch (Sociosploit) -- Security Researcher
Justin Hutchens ("Hutch") started his information security career in the United States Air Force, and now oversees the execution of red team assessments, penetration tests, and attack simulations -- as the Assessments Services Practice Lead at Set Solutions. He is the co-host of the "Ready, Set, Secure" InfoSec podcast. Hutch has spoken at multiple conferences, including DEF CON, ToorCon, Texas Cyber Summit, and HouSecCon. He holds a Master's degree in Information Systems and multiple information security certifications, including OSCP, GPEN, and GWAPT.

Twitter --> https://twitter.com/sociosploit
LinkedIn --> https://www.linkedin.com/in/justinhutchens/


Popular posts from this blog

Bootstrap Fail - Persistent XSS via Opportunistic Domain Sniping

This is the story of how a failed Bootstrap implementation on a website allowed me to gain JavaScript code execution in thousands of user browsers.

How I Found It

Before I get into the story, I'll quickly explain how I found this vulnerability in the first place. I have started developing a new opportunistic approach for acquiring persistent XSS (Cross-Site Scripting) on various web services across the Internet. This methodology consists of the following steps:
  • Use a custom web crawler to spider web services across the Internet and scrape source code. It iterates through IP addresses and hits the web-root content for every IP address.
  • Identify websites that are using externally hosted JavaScript. This is achieved for each server by reviewing the HTML source code for <script> tags with a source (src) value containing a full web address (rather than a local path). An example would be <script type='text/javascript' src='https://domain.name/path/to/ho
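The externally-hosted-JavaScript identification step described above can be sketched with the Python standard library alone. This is an illustrative approximation, not the crawler itself (the class and function names here are my own): it parses a page's HTML and keeps only <script> tags whose src is a full web address rather than a local path.

```python
# Hypothetical sketch of the external-script identification step:
# collect <script> src values that point at another host.
from html.parser import HTMLParser
from urllib.parse import urlparse

class ExternalScriptFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.external = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src") or ""
            # A full web address has a network location (domain);
            # a local path like /js/app.js does not.
            if urlparse(src).netloc:
                self.external.append(src)

def extract_external_scripts(html):
    finder = ExternalScriptFinder()
    finder.feed(html)
    return finder.external

page = """<html><head>
<script type='text/javascript' src='https://cdn.example.com/lib.js'></script>
<script src='/local/app.js'></script>
</head></html>"""
print(extract_external_scripts(page))  # ['https://cdn.example.com/lib.js']
```

A crawler would run this against the scraped web-root content of each IP address and record any hits for later review.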

Bypassing CAPTCHA with Visually-Impaired Robots

As many of you have probably noticed, we rely heavily on bot automation for a lot of the testing that we do at Sociosploit. And occasionally, we run into sites that leverage CAPTCHA ("Completely Automated Public Turing test to tell Computers and Humans Apart") controls to prevent bot automation. Even if you aren't familiar with the name, you've likely encountered these before. While there are some other vendors who develop CAPTCHAs, Google is currently the leader in CAPTCHA technology. It currently supports two products (reCAPTCHA v2 and v3). As v3 natively only functions as a detective control, I focused my efforts on identifying ways to bypass reCAPTCHA v2 (which functions more as a preventative control).

How reCAPTCHA v2 Works

reCAPTCHA v2 starts with a simple checkbox and evaluates the behavior of the user when clicking it. While I haven't dissected the underlying operations, I assume this part of the test likely makes determination

Building Bots with Mechanize and Selenium

The Sociosploit team conducts much of its research into the exploitation of social media using custom-built bots. On occasion, the team will use public APIs (Application Programming Interfaces), but more often than not, these do not provide the same level of exploitative capability that can be achieved through browser automation. To achieve this end, the Sociosploit team primarily uses a combination of two different Python libraries for building web bots for research. Each library has its own advantages and disadvantages:

Mechanize
  Pros:
  • Very lightweight, portable, and requires minimal resources
  • Easy to initially configure and install
  • Cuts down on superfluous requests (due to the absence of JavaScript)
  Cons:
  • Does not handle JavaScript or client-side functionality
  • Troubleshooting is done exclusively in text

Selenium
  Pros:
  • Operations are executed in the browser, making JavaScript rendering and manipulation easy
  • Visibility of browse
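The key trade-off above — Mechanize sees only the raw HTML, while Selenium drives a real browser that executes JavaScript — can be demonstrated without either library. The sketch below (names and the sample page are my own, purely illustrative) parses a page the way a non-JavaScript client like Mechanize would, and shows that content injected by a script never appears:

```python
# Illustrates why a non-JavaScript client (Mechanize-style) misses
# client-side content: it sees the static HTML, not the DOM a browser
# (what Selenium drives) would build after running the script.
from html.parser import HTMLParser

PAGE = """<html><body>
<div id="greeting"></div>
<script>
  // A real browser would execute this and populate the div;
  // a non-JS client never sees the result.
  document.getElementById('greeting').textContent = 'Hello';
</script>
</body></html>"""

class TextCollector(HTMLParser):
    """Collects visible text, skipping <script> bodies."""
    def __init__(self):
        super().__init__()
        self.in_script = False
        self.text = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False

    def handle_data(self, data):
        if not self.in_script and data.strip():
            self.text.append(data.strip())

parser = TextCollector()
parser.feed(PAGE)
print(parser.text)  # [] -- the greeting never appears without JavaScript
```

This is exactly why we reach for Selenium when a target builds its pages client-side, and stick with Mechanize when the content is served as static HTML and we want speed and a small footprint.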