Django is used to develop the project.
While managing and performing file operations on a computer or other storage devices, many sizeable duplicate files accumulate. This digital junk is a primary cause of storage shortages and reduced performance, so duplicate files need to be found and erased from the hard drive. It is also sometimes useful to know which files have replicas: if several copies of a requested file exist on the computer, each may be loaded into RAM and slow the system down. The project uses a data deduplication technique: whenever a file is uploaded, the system computes its checksum and compares it against the checksum information stored in the database. If the file already exists, the existing entry is refreshed; otherwise a new entry is created. Duplicate File Searcher and Remover helps reclaim valuable disk space and improve data efficiency. Deleting duplicates speeds up indexing and reduces backup time and size. The tool can quickly and safely find unwanted duplicate files on the system and then delete them or move them to a separate folder, according to the user's requirement, so the duplicates are removed from the system.
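A minimal sketch of the checksum check described above, assuming a Django model to hold the hash. The names used here (`FileRecord`, `checksum`, `stored_file`, `reference_count`, `save_without_duplicates`) are placeholders, not the project's actual schema.

```python
# Hypothetical checksum-based deduplication on upload.
import hashlib

from django.db import models


class FileRecord(models.Model):
    # SHA-256 hex digest of the file contents; unique so a duplicate cannot be stored twice.
    checksum = models.CharField(max_length=64, unique=True)
    stored_file = models.FileField(upload_to="uploads/")
    reference_count = models.PositiveIntegerField(default=1)


def compute_checksum(uploaded_file):
    """Read the uploaded file in chunks and return its SHA-256 hex digest."""
    digest = hashlib.sha256()
    for chunk in uploaded_file.chunks():
        digest.update(chunk)
    return digest.hexdigest()


def save_without_duplicates(uploaded_file):
    """Store the file only if its checksum is not already in the database."""
    checksum = compute_checksum(uploaded_file)
    record = FileRecord.objects.filter(checksum=checksum).first()
    if record:
        # Duplicate found: refresh the existing entry instead of storing another copy.
        record.reference_count += 1
        record.save(update_fields=["reference_count"])
        return record, False
    record = FileRecord.objects.create(checksum=checksum, stored_file=uploaded_file)
    return record, True
```

Hashing in chunks keeps memory use constant even for large uploads, and the unique constraint on `checksum` lets the database enforce deduplication as a backstop.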
- Admin:
  • Manage Files: Admin can manage files by adding, updating and deleting them.
  • Manage Users: Admin can manage users by blocking and unblocking them.
- User:
  • Upload File: User can upload a file; the checksum check runs at this point (see the sketch after this list).
  • View File: User can view the uploaded files.
  • Logout: User can log out.
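One possible shape of the Upload File flow as a Django view, reusing the `save_without_duplicates` helper sketched earlier. The form field name, template name, and URL name are placeholders.

```python
# Hypothetical upload view wiring the checksum check into the user's "Upload File" action.
from django.contrib import messages
from django.shortcuts import redirect, render


def upload_file(request):
    if request.method == "POST" and request.FILES.get("file"):
        record, created = save_without_duplicates(request.FILES["file"])
        if created:
            messages.success(request, "File uploaded.")
        else:
            messages.info(request, "Duplicate detected; the existing entry was updated.")
        return redirect("view_files")  # placeholder URL name for the View File page
    return render(request, "upload.html")  # placeholder template
```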
- Processor – Core i3
- Hard Disk – 160 GB
- Memory – 1 GB RAM
- Windows 7 or higher
- Python 3
- Django framework
- MySQL database
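For reference, the Django and MySQL pairing listed above is typically configured in `settings.py` along these lines; the database name and credentials below are placeholders for the project's own values.

```python
# Hypothetical DATABASES entry in settings.py for the MySQL backend.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "file_dedup_db",
        "USER": "db_user",
        "PASSWORD": "db_password",
        "HOST": "localhost",
        "PORT": "3306",
    }
}
```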