We are glad to present the new HBS version, 0.2a. The main work in this release was aimed at increasing the efficiency of the hash-cracking process.
This version adds a search for duplicates of found hashes in other hashlists with the same algorithm (not only the target hashlist). The search runs after a work task is done and whenever a new hashlist is loaded.
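The idea behind this duplicate search can be sketched in a few lines. This is an illustrative model only, not HBS's actual code: HBS does this with queries against its database, and the `propagate_found` helper and the hashlist structure here are hypothetical.

```python
# Hypothetical sketch: when a batch of hashes is cracked for one hashlist,
# mark the same hashes as cracked in every other hashlist with the same
# algorithm. (HBS does this in its DB; names here are illustrative.)

def propagate_found(found, hashlists, alg):
    """found: dict {hash: password} just cracked.
    hashlists: list of dicts {"alg": str, "uncracked": set, "cracked": dict}.
    Returns the number of hash entries updated across all matching lists."""
    updated = 0
    for hl in hashlists:
        if hl["alg"] != alg:
            continue  # duplicates are only searched within the same algorithm
        for h, password in found.items():
            if h in hl["uncracked"]:
                hl["uncracked"].remove(h)
                hl["cracked"][h] = password
                updated += 1
    return updated
```

So cracking a hash once is enough: every hashlist of that algorithm that contains it gets the result.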
HBS now automatically builds common hashlists containing all uncracked hashes in the database, separated by algorithm. Users cannot edit or delete these lists, only run work tasks against them. They help when you have several big lists with the same algorithm and want to run the same set of tasks against each of them. You no longer have to create the needed tasks for every hashlist; you can create one task set for all the hashes together. When the work is done, all found hashes are placed into every hashlist with this algorithm, and the common hashlist is updated (found hashes are removed from it).
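Conceptually, a common hashlist is just the union of all uncracked hashes per algorithm. A minimal sketch of that rebuild step, assuming the same hypothetical in-memory hashlist structure as above (HBS builds these lists inside its DB):

```python
# Illustrative only: rebuild the per-algorithm common hashlists as the
# union of uncracked hashes from every individual hashlist.

from collections import defaultdict

def build_common_hashlists(hashlists):
    """Return {alg: set of uncracked hashes} - one common list per algorithm."""
    common = defaultdict(set)
    for hl in hashlists:
        common[hl["alg"]] |= hl["uncracked"]
    return dict(common)
```

Rerunning this after each result import is what keeps the common lists in sync: hashes cracked via the common list disappear from it on the next rebuild.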
All HBS work has been parallelized. The following jobs now run independently: parsing hashlists and placing them in the DB, parsing Hashcat results and searching for found hashes in other hashlists, and building the common hashlists separated by algorithm. As a result, work tasks now run without interruption.
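The effect of this parallelization is that slow jobs (e.g. parsing a huge hashlist) no longer hold up the others. A minimal thread-based sketch of running independent jobs concurrently; the function and job names are assumed for illustration, not taken from HBS:

```python
# Illustrative sketch: run each maintenance job in its own thread so none
# of them blocks the others (or the work tasks themselves).

import queue
import threading

def run_jobs_concurrently(jobs):
    """Run each callable in its own thread; collect and return all results."""
    results = queue.Queue()
    threads = [threading.Thread(target=lambda job=job: results.put(job()))
               for job in jobs]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sorted(results.get() for _ in threads)
```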
Hashlists are now loaded into the DB via the «LOAD DATA LOCAL INFILE» construction, which makes it possible to load big hashlists with hundreds of thousands of hashes and more.
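For reference, the statement looks roughly like the sketch below. The table and column names (`hashes`, `hash`, `hashlist_id`) are hypothetical, not HBS's real schema, and real code should escape or validate the file path before interpolating it:

```python
# Illustrative only: build a LOAD DATA LOCAL INFILE statement for a
# one-hash-per-line file. Table/column names are assumed, not HBS's schema.

def build_load_query(path, table="hashes", hashlist_id=1):
    """Return the SQL to bulk-load a hashlist file into `table`."""
    return (
        "LOAD DATA LOCAL INFILE '{0}' "
        "INTO TABLE `{1}` "
        "LINES TERMINATED BY '\\n' "
        "(hash) SET hashlist_id = {2}".format(path, table, hashlist_id)
    )
```

The statement must be executed through a MySQL connection with `local_infile` enabled on both the client and the server; a server-side bulk load like this is far faster than inserting hundreds of thousands of rows one by one.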
Other small additions:
- Hashlists got a 'status' field. A work task for a hashlist can be created only after the hashlist has been parsed and has the ready-to-work status.
- Users can select the status of a work task when creating it (it is no longer always stopped after creation)
- Possibility to hide finished tasks (web interface)
- Possibility to set a negative priority for a work task
- The cron.py script was renamed to hbs.py
- Out-files are no longer removed from the HDD and the DB after parsing
- Fixed a UTF-8 passwords bug in the already-found-hashes procedure
- Deleting hashlists that are currently in work is now forbidden
- Fixed a dictionary deletion bug (the file was not deleted along with its dict DB record)
- Fixed a bug where priority was set to 0 when the user cancelled the priority dialog
- Fixed a \r bug in hashlists
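On the \r item: stray carriage returns usually come from hashlist files saved with Windows line endings, which silently corrupt the stored hashes. A minimal sketch of the kind of normalization that fixes it (illustrative, not HBS's exact code):

```python
# Illustrative: strip trailing \r from each line when parsing a hashlist,
# and drop lines that are empty after stripping.

def read_hashlist_lines(text):
    return [line.rstrip("\r") for line in text.split("\n") if line.rstrip("\r")]
```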
Topic on team forum: http://hack4sec.pro/forum/viewtopic.php?f=5&t=18