R - Error: Duplicate identifiers for rows


I need to reshape a data frame, but I am running into a problem.

I have this command:

library(tidyverse)
data1 = data1 %>% gather(Day, value, Day01:Day31) %>% spread(Station, value)

And I get this error:

Error: Duplicate identifiers for rows (130933, 131029), (389113, 389209), (647293, 647389), (905473, 905569), (1163653, 1163749), (1421833, 1421929), (1680013, 1680109), (1938193, 1938289), (2196373, 2196469), (2454553, 2454649), (2712733, 2712829), (2970913, 2971009), (3229093, 3229189), (3487273, 3487369), (3745453, 3745549), (4003633, 4003729), (4261813, 4261909), (4519993, 4520089), (4778173, 4778269), (5036353, 5036449), (5294533, 5294629), (5552713, 5552809), (5810893, 5810989), (6069073, 6069169), (6327253, 6327349), (6585433, 6585529), (6843613, 6843709), (7101793, 7101889), (7359973, 7360069), (7618153, 7618249), (7876333, 7876429), (130934, 131030), (389114, 389210), (647294, 647390), (905474, 905570), (1163654, 1163750), (1421834, 1421930), (1680014, 1680110), (1938194, 1938290), (2196374, 2196470), (2454554, 2454650), (2712734, 2712830), (2970914, 2971010), (3229094, 3229190), (3487274, 3487370), (3745454, 3745550), (4003634, 4003730), (4261814, 4261910), (4519994, 4520090

The strange thing is that I also get these results:

library(tibble)   # rownames_to_column() comes from tibble, not dplyr
test = rownames_to_column(data1, "VALUE")
length(unique(test$VALUE))                 # Result: 258180, same as the number of rows
length(unique(test$VALUE)) == nrow(test)   # Result: TRUE

So if you look at the error, it reports row numbers that do not exist in my data frame.
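
For reference, spread() checks the uniqueness of the rows of the gathered (long) data frame, not the row names of the original one, so a check like the following is closer to what the error is about (a rough sketch, assuming the same data1, Day01:Day31 and Station columns as above):

library(tidyverse)
collisions <- data1 %>%
    gather(Day, value, Day01:Day31) %>%
    group_by_at(vars(-value)) %>%   # group by every column except the gathered value
    filter(n() > 1) %>%
    ungroup()
nrow(collisions)   # > 0 reproduces the "Duplicate identifiers" error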

Apr 9, 2018 in Data Analytics by nirvana

1 answer to this question.


You can try the code below. spread() needs every row it combines to be uniquely identified by the remaining columns after gather(), and tibble::rowid_to_column() adds a column of row ids, which makes each row unique.

data2 <- data1 %>%
    gather(Day, value, Day01:Day31) %>%
    tibble::rowid_to_column() %>%
    spread(Station, value)
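
As a quick sanity check, here is a toy example (made-up Station/Day columns, not the asker's data) where spread() fails for the same reason and the row id fixes it:

library(tidyverse)

toy <- tibble(
    Station = c("A", "A", "B"),   # Station "A" appears twice -> duplicate identifiers
    Day01   = c(1.0, 1.5, 2.0),
    Day02   = c(3.0, 3.5, 4.0)
)

# Fails: after gather(), (Day, Station) no longer identifies a single row
# toy %>% gather(Day, value, Day01:Day02) %>% spread(Station, value)

# Works: the row id makes every gathered row unique
toy %>%
    gather(Day, value, Day01:Day02) %>%
    tibble::rowid_to_column() %>%
    spread(Station, value)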
answered Apr 9, 2018 by kappa3010

