#1
Hi,
I am working on finding duplicates in an Excel file that contains 545,000 rows. I tried the conditional formatting method, and it ran for more than an hour. Now I am trying COUNTIF, and that is also taking more than an hour. Is there a quicker way to find duplicates in a large Excel file?
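COUNTIF in the usual `=COUNTIF(range, cell)>1` pattern re-scans the whole column for every cell, so over 545,000 rows it is roughly 545,000 × 545,000 comparisons, which explains the run time. One quicker route, not mentioned in the thread and only workable if the file can be processed outside Excel, is a single hashed pass with pandas. The file name, sheet index, and the idea that a duplicate means a fully repeated row are placeholder assumptions; it needs the pandas and openpyxl packages.

```python
# Sketch only: flag duplicate rows in one hashed pass instead of per-cell COUNTIF.
# "data.xlsx" and sheet 0 are placeholders for the real workbook.
import pandas as pd

df = pd.read_excel("data.xlsx", sheet_name=0)

# duplicated(keep="first") marks every later copy of a row already seen,
# leaving the first occurrence unflagged.
dupe_mask = df.duplicated(keep="first")

print(f"{dupe_mask.sum()} duplicate rows out of {len(df)}")

# Save just the flagged rows so they can be reviewed before anything is deleted.
df[dupe_mask].to_excel("duplicates.xlsx", index=False)
```

If only one column defines a duplicate, `duplicated()` also takes a `subset` argument, which is exactly the distinction the next reply asks about.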
#2
What is considered a duplicate?

- All cells of one row matching all cells of another row?
- Matching cells in one particular column?
- Something else?

Can you attach a small de-sensitized sample worksheet of what you're dealing with, maybe 100 rows or so?

PS: Excel 2003 has a maximum of 65,536 rows, so you might want to change your profile to the version you actually have.
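The distinction this reply draws matters for any tool: whole-row duplicates and duplicates in a single key column are different questions and usually give different counts. A minimal sketch of both readings, assuming pandas and using a placeholder file name and column name that do not appear in the thread:

```python
# Sketch only: the two readings of "duplicate" give different results.
import pandas as pd

df = pd.read_excel("data.xlsx", sheet_name=0)  # placeholder file name

# Reading 1: all cells of one row match all cells of another row.
whole_row_dupes = df[df.duplicated(keep=False)]

# Reading 2: matching cells in one particular column ("ID" is a placeholder).
one_column_dupes = df[df.duplicated(subset=["ID"], keep=False)]

print(len(whole_row_dupes), "rows duplicated in full")
print(len(one_column_dupes), "rows sharing an ID with at least one other row")
```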
#3
Maybe start with the reason why you want those duplicate rows marked.

Do you want to delete the redundant ones (a reasonable action IMHO)? Then a logical solution would be an ODBC query with a DISTINCT clause to pull all unique entries into a separate table (given the number of rows in the original table, a separate workbook would be a reasonable destination). After that you can scrap the original, or delete the sheet with the original table and then move the new table into the old workbook.
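The same idea can be reproduced outside Excel. The sketch below loads the sheet into a temporary in-memory SQLite table, runs the suggested SELECT DISTINCT there, and writes the unique rows to a separate workbook. SQLite is only a stand-in for the ODBC route described above, and the file names and the assumption that whole-row uniqueness is wanted are placeholders.

```python
# Sketch only: mirror the "DISTINCT into a new table" approach with SQLite.
import sqlite3
import pandas as pd

df = pd.read_excel("data.xlsx", sheet_name=0)  # placeholder file name

# Load the sheet into a throwaway in-memory table so a plain
# SELECT DISTINCT can do the de-duplication.
conn = sqlite3.connect(":memory:")
df.to_sql("source_rows", conn, index=False)

unique_rows = pd.read_sql_query("SELECT DISTINCT * FROM source_rows", conn)
conn.close()

# Keep the unique rows in a separate workbook, then retire the original.
unique_rows.to_excel("unique_rows.xlsx", index=False)
print(f"{len(df) - len(unique_rows)} duplicate rows removed")
```

pandas' own `drop_duplicates()` gives the same result without the SQL detour; the SQLite step is only there to mirror the DISTINCT clause suggested above.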