Comment by likeafox on 23/05/2018 at 19:53 UTC

7 upvotes, 1 direct reply (showing 1)

View submission: New and improved post requirements

Hey folks - thanks for the UI improvement for long lists of domains. That one was huge for us.

The thing we'd really like to do is enforce both a domain whitelist *and* a domain blacklist using post requirements. The reasoning is that for non-whitelisted domains, we want to let the user know that the domain has not been reviewed yet and may be eligible for consideration. Meanwhile, we have various groups of reviewed-and-rejected domains that fall into one of several categories, each with its own reason for rejection.

It would be very important for us to make the user understand *why* a submission is being rejected, along with context-dependent information on how they should proceed. Currently AutoModerator does a very nice job of this overall, but pushing our tailored removal reasons into the pre-submit form would be a much better user experience.
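To make the idea concrete, here's a minimal sketch of the triage logic we have in mind, as it might run behind a pre-submit form. All domains, messages, and function names below are hypothetical:

```python
# Hypothetical whitelist-plus-blacklist triage for a pre-submit check.
# Domain lists and removal messages are made up for illustration.
from urllib.parse import urlparse

WHITELIST = {"example-approved.com"}
BLACKLIST = {
    "example-spam.com": "This domain was rejected for spam. See our wiki for appeals.",
    "example-paywall.com": "Paywalled sources are not allowed here.",
}
UNREVIEWED_MSG = (
    "This domain has not been reviewed yet and may be eligible for "
    "consideration. See the submission guidelines for how to request a review."
)

def check_submission(url: str) -> tuple[bool, str]:
    """Return (allowed, message) for a submitted link."""
    domain = urlparse(url).netloc.lower().removeprefix("www.")
    if domain in WHITELIST:
        return True, ""
    if domain in BLACKLIST:
        return False, BLACKLIST[domain]   # tailored, category-specific reason
    return False, UNREVIEWED_MSG          # unknown domain: explain next steps
```

The point is the third branch: an unknown domain gets a different, more encouraging message than a reviewed-and-rejected one, which is exactly the distinction a single blacklist can't express.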

--------------------------------------------------------------------------------

For the love of god: can we please get pre-submit duplicate link detection working? As far as I know, this still isn't implemented at parity with the r2 / Old Site submit form. We also have to do a ton of custom and manual work for duplicate detection; checking the canonical URL rather than just the raw URL string would be of YUGE benefit to us.
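For anyone unfamiliar with the canonical-URL approach, here's a rough sketch, assuming the target page exposes a `<link rel="canonical">` tag; error handling is deliberately minimal:

```python
# Sketch of canonical-URL lookup for duplicate detection.
from html.parser import HTMLParser
from urllib.request import urlopen

class CanonicalFinder(HTMLParser):
    """Pull the href of <link rel="canonical"> out of an HTML page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attr = dict(attrs)
        if tag == "link" and attr.get("rel") == "canonical":
            self.canonical = attr.get("href")

def canonical_url(url: str) -> str:
    """Return the page's declared canonical URL, falling back to the input."""
    parser = CanonicalFinder()
    with urlopen(url, timeout=10) as resp:  # minimal fetch: no retries, naive decoding
        parser.feed(resp.read().decode("utf-8", errors="replace"))
    return parser.canonical or url
```

With something like this, two submissions count as duplicates when `canonical_url(a) == canonical_url(b)`, even when the raw strings differ.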

--------------------------------------------------------------------------------

The pre-submit validation tools are an excellent idea, and we hope to see that product continue to develop, and be used in conjunction with auto-moderator for a long time to come.

Replies

Comment by Tetizeraz at 24/05/2018 at 00:12 UTC*

3 upvotes, 0 direct replies

> For the love of god: can we please get pre-submit duplicate link detection working? As far as I know, this still isn't implemented at parity with the r2 / Old Site submit form. We also have to do a ton of custom and manual work for duplicate detection; checking the canonical URL rather than just the raw URL string would be of YUGE benefit to us.

I remember reading some time ago that Firefox would strip information about a URL's source from the URL itself, e.g. .typeform.com/**?ref=producthunt**, where the bold part is removed. It would be nice if Reddit did something similar; that would help with duplicate links.
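For illustration, a minimal sketch of that kind of query-parameter stripping; the parameter list and example URL are made up, and Firefox's actual behavior may differ:

```python
# Strip known tracking parameters so equivalent links compare equal.
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

TRACKING_PARAMS = {"ref", "utm_source", "utm_medium", "utm_campaign"}

def strip_tracking(url: str) -> str:
    """Drop tracking query parameters, keeping everything else intact."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

# Hypothetical example:
# strip_tracking("https://example.typeform.com/?ref=producthunt")
# -> "https://example.typeform.com/"
```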