
1、First, open a terminal in Kali and launch HTTrack, a web information-gathering (site-mirroring) tool. Running `httrack` with no arguments shows its usage syntax and drops you into the interactive wizard.

        Note that HTTrack obeys a site's robots.txt by default, so unless that restriction is lifted, the disallowed parts of a site will not be mirrored. Put simply: HTTrack can follow basic JavaScript, applet, and Flash links, but complex links (links generated by functions or expressions) and server-side image maps cannot be mirrored.
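Since the robots.txt behavior is configurable, the sketch below assembles the corresponding command line. The URL and output directory are placeholders, the flags are as documented in `man httrack`, and the helper only prints the command instead of running a real crawl:

```shell
# Build (but do not run) an HTTrack command line.
# http://www.example.com and /tmp/mirror are placeholder values.
#   -s0  never obey robots.txt (HTTrack's default is -s2: always obey)
#   -v   verbose output to the screen
mirror_cmd() {
  printf 'httrack %s -O %s -s0 -v\n' "$1" "$2"
}
mirror_cmd "http://www.example.com" "/tmp/mirror"
# → httrack http://www.example.com -O /tmp/mirror -s0 -v
```

Only lift the robots.txt restriction on sites you are authorized to mirror.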

      In general, you can gather plenty of target information without digging very deep: the website's physical address, phone numbers, e-mail addresses, hours of operation, business relationships, employee names, social connections, and assorted public trivia. News coverage also matters in a penetration test: companies publicize things they are proud of, and those reports can reveal useful details, such as corporate mergers or which servers the business operates.

       Of course, HTTrack also has a graphical version (WinHTTrack) for Windows.

2、An explanation of the common options

      Enter project name  //Enter a name for the project; HTTrack uses it to create a local project directory.

      Enter URLs (separated by commas or blank spaces)  //The site address(es) to crawl.

    The choices offered at the Action prompt are:

(enter) 1 Mirror Web Site(s)                   //mirror the site
2 Mirror Web Site(s) with Wizard               //mirror the site with the interactive wizard
3 Just Get Files Indicated                     //fetch only the files indicated
4 Mirror ALL links in URLs (Multiple Mirror)   //mirror all links in the URLs (multiple mirrors)
5 Test Links In URLs (Bookmark Test)           //test the links in the URLs (bookmark test)
0 Quit                                         //quit

Proxy (return=none) :  //press Enter if you are not using a proxy

You can define wildcards, like: -*.gif +www.*.com/*.zip -*img_*.zip
Wildcards (return=none) :      
//To download without wildcard filters, just press Enter here.
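For reference, the whole interactive session above collapses into a single command line. This is a sketch with placeholder URL, proxy, and paths; the flags are from `httrack --help` (-O output directory, -P proxy as host:port, plus the same wildcard filter syntax the prompt suggests), and the helper only assembles and prints the command:

```shell
# Assemble the non-interactive equivalent of the wizard answers (placeholders throughout).
#   -O  project/output directory        -P  proxy as host:port
#   "-*.gif" excludes .gif files; "+www.*.com/*.zip" includes matching .zip files
filtered_cmd() {
  local url="$1" out="$2" proxy="$3"
  printf 'httrack %s -O %s -P %s "-*.gif" "+www.*.com/*.zip"\n' "$url" "$out" "$proxy"
}
filtered_cmd "http://www.example.com" "/tmp/project" "127.0.0.1:8080"
# → httrack http://www.example.com -O /tmp/project -P 127.0.0.1:8080 "-*.gif" "+www.*.com/*.zip"
```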

3、Installing HTTrack: Kali ships with the tool preinstalled; on other Debian-based Linux or Unix-like systems you can install it with apt-get install httrack.
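On a Debian or Ubuntu system (assuming an apt-based distribution and sudo rights; not run here), the installation looks like this:

```shell
# Debian/Ubuntu-style install of HTTrack.
sudo apt-get update
sudo apt-get install -y httrack

# Confirm the binary is on PATH:
httrack --version
```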

4、A concrete site-mirroring example follows.

    First, taking my own blog as the example, I will show how to use this tool by cloning a mirror of the blog.

 After the whole site has been crawled, there may be some warnings and error messages; the tool reports the total number of pages and links collected, as shown once the crawl completes.

  Open a few of the mirrored pages.

From the two mirrored pages above, you can see that under the default settings the mirror strips out the page's framework, and the blog content is reduced to a bare outline list. By setting filter parameters, though, you can also control which URLs are followed at different link depths within the site.
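The depth control mentioned above maps onto HTTrack's -r option (and %e for external links), per `man httrack`. Again a sketch with placeholder values whose helper only prints the command it would run:

```shell
# Limit how deep HTTrack follows links (placeholder URL and path).
#   -r2  follow links at most 2 levels deep
#   %e0  do not descend into external sites at all
depth_cmd() {
  printf 'httrack %s -O %s -r2 %%e0\n' "$1" "$2"
}
depth_cmd "http://www.example.com" "/tmp/shallow"
# → httrack http://www.example.com -O /tmp/shallow -r2 %e0
```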

 Mirrored pages can be analyzed offline. When you browse pages on the live web host instead, the more you browse and poke around, the more likely the site is to track your activity. Even casual browsing can be traced: any direct interaction with a target resource may leave a digital trail at any time.

   But do not mirror a website without authorization. A professionally protected site that deploys Safedog or another web application firewall may log this behavior as an attack. That is why the demonstration here mirrors my own page.

Then, if testing reveals a vulnerability in a page, you can proceed to penetrate the website.

After building the site on your own host, test it with the method above; here the main site's page was penetration-tested with an OWASP-style parameter payload, id=1&Submit=Submit.


Link of this Article: HTTrack (website mirroring tool)
