Duplicate transactions from linked accounts

Man this is disappointing.

I spent all day yesterday uploading and categorizing everything, and now I get up this morning to discover all of my transactions have been uploaded again, duplicated, overnight.

Guys, please fix this. I really want it to work but so far it’s proving untrustworthy.

Thanks for any help you can provide,

Drew

@studio Thanks for reporting this. We’ve checked out your account and don’t see any obvious duplicates. We also don’t see any open support conversations for this issue.

Can you log in to Stessa and open a private convo via the blue circle at lower right? Attach a screenshot of the dupes you’re seeing and we’ll get back to you shortly. Thanks!

I’m having this same issue. I have repeatedly gone through and removed the duplicates but they keep coming back and new ones are showing up.

I have also found some duplicate transactions from an imported account. The issue is with an Appfolio linked account. The property management statements have overlapping time periods, and the Stessa system seems to handle this OK, except when there is a date change to the external account link, which happens every 6 months or so.
For example, I have the exact same Property Management fee on July 9, 2019 twice. All details in the two transactions are exactly the same except the "External Account" name field, which says "(PM name) - 2.28.19" in one and "(PM name) - 7.18.19" in the other.

Suggestion: In the review and import function, perhaps you could raise a red flag for "potential duplicate" if all fields are the same except for the external account field, and let users see the other transaction for reference so we can delete one of them. Or maybe even if date, amount, and property (and category?) are the same. If the account is set to "Import and link," you could also force any transaction that looks like a potential duplicate into "needs review." I would much rather review potential duplicates up front than try to filter and find them afterward.

Otherwise, if you create a database cleaning tool that helps us find potential duplicates, that would be useful as well. But the obvious preference is to avoid duplicates in the first place.
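For what it's worth, the heuristic described above is simple to express in code. Here's a rough sketch (the field names like `date`, `amount`, `property`, `category`, and `external_account` are hypothetical, not Stessa's actual schema): group transactions on everything except the external account field, and flag any group with more than one entry.

```python
from collections import defaultdict

def flag_potential_duplicates(transactions):
    """Flag transactions as potential duplicates when date, amount,
    property, and category all match, ignoring the external account
    field (which changes when the account link is refreshed)."""
    groups = defaultdict(list)
    for tx in transactions:
        key = (tx["date"], tx["amount"], tx["property"], tx["category"])
        groups[key].append(tx)
    # Every transaction in a group of 2+ needs manual review.
    return [tx for txs in groups.values() if len(txs) > 1 for tx in txs]

# Example mirroring the duplicated July 9, 2019 Property Management fee:
txs = [
    {"date": "2019-07-09", "amount": -150.00, "property": "Unit A",
     "category": "Property Management",
     "external_account": "(PM name) - 2.28.19"},
    {"date": "2019-07-09", "amount": -150.00, "property": "Unit A",
     "category": "Property Management",
     "external_account": "(PM name) - 7.18.19"},
    {"date": "2019-07-10", "amount": -85.00, "property": "Unit A",
     "category": "Repairs",
     "external_account": "(PM name) - 7.18.19"},
]
print(len(flag_potential_duplicates(txs)))  # 2 — only the matched pair
```

This is just to illustrate the matching rule, not how Stessa's importer actually works; the real system would presumably do this at import time so the pair lands in "needs review" instead of posting twice.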

Thanks!

I am seeing the same issue. Does anybody have any explanation or workaround for this?