MySQL SQL statements for finding and filtering duplicate data
We have covered plenty of statements for filtering out duplicate data before. Below are some SQL statements I have collected for finding and filtering duplicates; I hope they are useful to you.
Find the duplicated device_id values, i.e. those that appear more than once:
select device_id from device group by device_id having count(device_id) > 1;
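If you also want to see how many rows each duplicated device_id has, a small variation of the same query on the same device table returns the count alongside each id (cnt is just a column alias):

-- variation: also return how many rows each duplicated device_id has
select device_id, count(*) as cnt
from device
group by device_id
having count(*) > 1;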
Query all duplicate rows:
select userid, device_id, create_date
from device
where device_id in (select device_id from device group by device_id having count(device_id) > 1)
order by device_id, create_date desc;
Get the latest create_date within each group of duplicates:
select max(create_date) from device group by device_id having count(device_id)>1;
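Note that this only returns the timestamps themselves; if you also need to know which device_id each latest create_date belongs to, add it to the select list:

-- also return the device_id each latest create_date belongs to
select device_id, max(create_date) as latest_create_date
from device
group by device_id
having count(device_id) > 1;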
Filter query: select all duplicate rows except the one with the latest create_date in each group:
select *
from device
where device_id in (select device_id from device group by device_id having count(device_id) > 1)
  and create_date not in (select max(create_date) from device group by device_id having count(device_id) > 1)
order by device_id, create_date desc;
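If the goal is to remove those older rows directly, a multi-table delete can do it in one pass. The following is only a sketch, assuming the same device table and that no two rows of the same device_id share exactly the same create_date (tied newest rows would all survive):

-- sketch: delete the older duplicates, keeping the row with the newest
-- create_date per device_id (assumes no create_date ties within a device_id)
delete d
from device d
join device newer
  on newer.device_id = d.device_id
 and newer.create_date > d.create_date;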
Now let's look at a fuller example.
The table structure is as follows:
mysql> desc test1;
+--------------+------------------+------+-----+---------+----------------+
| Field        | Type             | Null | Key | Default | Extra          |
+--------------+------------------+------+-----+---------+----------------+
| ID           | int(10) unsigned | NO   | PRI | NULL    | auto_increment |
| SENDERNAME   | varchar(32)      | YES  |     | NULL    |                |
| RECEIVERNAME | varchar(64)      | YES  |     | NULL    |                |
| SENDER       | varchar(64)      | NO   |     | NULL    |                |
| RECEIVER     | varchar(64)      | NO   |     | NULL    |                |
| SUBJECT      | varchar(512)     | NO   |     | NULL    |                |
| CONTENT      | text             | NO   |     | NULL    |                |
| PRIORITY     | int(11)          | NO   | MUL | NULL    |                |
| STATUS       | int(11)          | NO   | MUL | NULL    |                |
| CREATETIME   | datetime         | NO   |     | NULL    |                |
| SENDTIME     | datetime         | YES  |     | NULL    |                |
+--------------+------------------+------+-----+---------+----------------+
SUBJECT and RECEIVER should have been given a unique key, but this was missed at design time, so the table later accumulated many duplicate records.
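For reference, the missing constraint might look like the sketch below. The key name and the SUBJECT prefix length are assumptions: SUBJECT is varchar(512), so a prefix is used to stay within InnoDB's index size limits (the exact length that fits depends on charset and row format), and the key can only be added once the duplicates from steps 1 and 2 below have been removed:

-- hypothetical unique key the design missed; key name and prefix length are
-- assumptions. A prefix unique key only enforces uniqueness on the leading
-- characters of SUBJECT, and it can only be added after deduplication.
alter table test1
  add unique key uk_receiver_subject (RECEIVER, SUBJECT(100));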
1. Query the records that need to be deleted, keeping one record per group:
select a.id, a.subject, a.RECEIVER
from test1 a
left join (select c.subject, c.RECEIVER, max(c.id) as bid
           from test1 c
           where status = 0
           group by RECEIVER, SUBJECT
           having count(1) > 1) b on a.id < b.bid
where a.subject = b.subject
  and a.RECEIVER = b.RECEIVER
  and a.id < b.bid;
2. Delete the duplicate records, keeping only one per group. Note: SUBJECT and RECEIVER should be indexed, otherwise this will be very slow:
delete a
from test1 a,
     (select c.subject, c.RECEIVER, max(c.id) as bid
      from test1 c
      where status = 0
      group by RECEIVER, SUBJECT
      having count(1) > 1) b
where a.subject = b.subject
  and a.RECEIVER = b.RECEIVER
  and a.id < b.bid;
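Since step 2 says SUBJECT and RECEIVER need to be indexed first, here is a minimal sketch of such an index; the index name and the SUBJECT prefix length are assumptions:

-- supporting index so the GROUP BY and the join on SUBJECT/RECEIVER do not
-- scan the whole table; index name and prefix length are assumptions
alter table test1
  add index idx_receiver_subject (RECEIVER, SUBJECT(100));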
That covers the main SQL statements for filtering duplicate data. Once you understand them, duplicate data should hardly be a worry anymore.