wholehog | 1 year ago

> Did they mess up when they did not pre-train or they followed the "steps" described in the original repo and tried to get a fair reproduction?

The Circuit Training repo was just walking through an example. It is common for an open-source repo to describe simple examples for testing and validating your setup; that does not mean this is how you should get optimal results in general. The confusion may stem from their statement that, in this example, they produced results comparable to the pre-trained results in the paper. That is clearly not a general repudiation of pre-training.

If Cheng et al. genuinely felt this was ambiguous, they should have reached out to the corresponding authors. If they ran into some part of the repo they felt they had to "reverse-engineer", they should have asked about that, too.

dogleg77 | 1 year ago

You must be kidding. They did reach out and documented their interactions, with names of engineers, dates, and all. Someone is ghosting someone here.

wholehog | 1 year ago

"These major methodological differences unfortunately invalidate Cheng et al.’s comparisons with and conclusions about our method. If Cheng et al. had reached out to the corresponding authors of the Nature paper[8], we would have gladly helped them to correct these issues prior to publication[9].

[8] Prior to publication of Cheng et al., our last correspondence with any of its authors was in August of 2022 when we reached out to share our new contact information.

[9] In contrast, prior to publishing in Nature, we corresponded extensively with Andrew Kahng, senior author of Cheng et al. and of the prior state of the art (RePlAce), to ensure that we were using the appropriate settings for RePlAce."