Finally got around to moving from MySQL to PostgreSQL

Gave up trying to migrate the database contents, and just did an OPML export and import.

Went from 17.12 to 19.2, and am very happy.

Thanks to fox and all contributors for a lovely piece of software (-:

And what did you have to lose? Starred articles?

Thanks to fox and all contributors for a lovely piece of software

And it gains in value day after day as awareness of the darker sides of social networks and other ‘walled gardens’ grows. But fox is an acquired taste, of course.

years of searchable history, the old one will remain as an archive

Like olives, or avocados?

(-:

maybe there’s a use case for a better migration tool which would do what import_export does but on the entire dataset instead of only starred articles

it would likely have to be command line only because it’s way easier to implement. dump everything into a series of XML files instead of one, etc.

e: maybe using JSON instead of XML
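the batched dump could look roughly like this (just a sketch, not actual plugin code — export_user_articles() and BATCH_SIZE are made up for illustration, the table names are stock tt-rss schema):

<?php

const BATCH_SIZE = 1000;

function export_user_articles(PDO $pdo, int $owner_uid, string $out_file): void {
    $zip = new ZipArchive();
    if ($zip->open($out_file, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true)
        die("unable to create $out_file\n");

    // archived articles have no feed row, hence the LEFT JOIN
    $sth = $pdo->prepare(
        "SELECT e.guid, e.title, e.link, e.content, ue.marked,
                f.feed_url, f.title AS feed_title
           FROM ttrss_entries e
           JOIN ttrss_user_entries ue ON ue.ref_id = e.id
           LEFT JOIN ttrss_feeds f ON f.id = ue.feed_id
          WHERE ue.owner_uid = ?
          ORDER BY e.id
          LIMIT ? OFFSET ?");

    for ($chunk = 0; ; $chunk++) {
        $sth->bindValue(1, $owner_uid, PDO::PARAM_INT);
        $sth->bindValue(2, BATCH_SIZE, PDO::PARAM_INT);
        $sth->bindValue(3, $chunk * BATCH_SIZE, PDO::PARAM_INT);
        $sth->execute();

        $batch = $sth->fetchAll(PDO::FETCH_ASSOC);
        if (!$batch) break;

        // one numbered json file per chunk: 00000000.json, 00000001.json, ...
        $zip->addFromString(sprintf("%08d.json", $chunk), json_encode($batch));
    }

    $zip->close();
}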

Interesting how others use RSS. I treat news as ephemeral. So I use tt-rss as Twitter minus algorithms and ads. I pool together hundreds of feeds – more than I’d ever be able to read – filter out the interesting articles, and purge everything every three days. Thus, to me the OPML is more valuable than the database.

Apparently, you place more value on the content of your feeds.

@fox
I think that would be a fscking fantastic tool (-:

@mamil
most of the feeds I have are technical blogs (rather than “news”) and there have been several occasions where I remembered that I’d seen something in a ttrss feed, searched ttrss and found it. Over time some sites have gone offline, but in ttrss I still have that content (-:

this shouldn’t be hard to write since most of the code can be reused from the plugin, i’ll make a note to take a look at this

alright so if anyone is feeling brave enough, help test this thing:

https://git.tt-rss.org/fox/ttrss-data-migration

i suggest using a test user, at least for your first import.

since this tool completely supersedes the import_export plugin, it should be considered deprecated now. the only thing missing currently is a knob to export only starred/archived articles to replicate the limited functionality of the stock plugin.

most of the code is copy-pasted from there anyway so there’s not much sense in keeping both.
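usage is symmetric between instances, along these lines (user and filename are examples; the import invocation matches what’s shown further down the thread, export should be the matching --data_export switch per the README):

php ./update.php --data_user alice --data_export backup.zip   # on the old instance
php ./update.php --data_user alice --data_import backup.zip   # on the new instance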

I am currently trying to switch from MySQL to PostgreSQL. I exported the OPML file with the settings, but realized starred articles were not migrated.
I found this post and followed the instructions, but I cannot run the command to create the file.

php ./update.php --help does not show anything related to the data_migration plugin, which I uploaded into plugins.local and added to config.php:

[rss_old]$ php ./update.php --help
Tiny Tiny RSS data update script.

Options:
  --feeds              - update feeds
  --daemon             - start single-process update daemon
  --task N             - create lockfile using this task id
  --cleanup-tags       - perform tags table maintenance
  --quiet              - don't output messages to stdout
  --log FILE           - log messages to FILE
  --log-level N        - log verbosity level
  --indexes            - recreate missing schema indexes
  --update-schema      - update database schema
  --gen-search-idx     - generate basic PostgreSQL fulltext search index
  --convert-filters    - convert type1 filters to type2
  --send-digests       - send pending email digests
  --force-update       - force update of all feeds
  --list-plugins       - list all available plugins
  --debug-feed N       - perform debug update of feed N
  --force-refetch      - debug update: force refetch feed data
  --force-rehash       - debug update: force rehash articles
  --help               - show this help
Plugin options:
[rss_old]$

I am surely missing something…
Glad to do more testing, or to fix my environment…

P.S.: latest git, PHP 7, MySQL for the old version

this means the plugin is not loaded.

either you did something wrong with config.php or (more likely) you cloned it into an incorrectly named directory under plugins.local, see README.
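i.e. something like this (the plugin list below is just an example — keep whatever you already have, plus data_migration):

# directory name must match the plugin name exactly
git clone https://git.tt-rss.org/fox/ttrss-data-migration plugins.local/data_migration

# config.php (tt-rss of this era enables plugins via a define)
define('PLUGINS', 'auth_internal, note, data_migration');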

Ok, my bad. The directory was named data-migration and not data_migration…
I was able to run the command, but where is the file saved?
That is not in the readme :face_with_hand_over_mouth:

edit: the file is saved in the root folder… Will try to import and will let you know

current directory unless you pass a full path. i thought this was obvious. :slight_smile:

that was my first guess too, but maybe I checked before the script had finished writing the file…

I exported only starred articles and it seems to be working, except:

[13:35:50/1035358] imported 316 (out of 369) articles, created 0 feeds.

Is there any way to know why 53 were discarded?
Maybe starred articles from feeds that I unsubscribed from in the meantime?

Is there a way to run it in “verbose” mode or have it output to a file?

Let me know if I should run some other tests

nope, there isn’t any verbose mode. maybe i should add some warnings.

import requires a valid feed url and feed title for articles; this skips articles which were originally in the archived feed, since they don’t have either. do you have 53 archived starred articles? :slight_smile:

if this is the case, i’ll make a note to take a look at proper importing of stuff back into archived.
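fwiw the import-side check amounts to roughly this (a sketch, not the literal plugin code; import_article() is a made-up helper):

foreach ($batch as $article) {
    // articles exported from the virtual Archived feed carry no feed_url
    // or feed_title, so there's no feed to attach them to -- skip them
    if (empty($article["feed_url"]) || empty($article["feed_title"])) {
        $skipped++;
        continue;
    }
    import_article($article);
}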

no, I did have 369 starred articles and no archived articles at all

Unfortunately, the number is too large to compare the two lists and find out which ones are missing.

If I rerun the import the result is:

imported 0 (out of 369) articles, created 0 feeds.

i dunno then. you can PM me your exported file, i’ll take a look. it would be easier to add relevant warnings too, i guess.

I noticed something odd. When I migrated the feeds to the new installation, all the articles were refetched from the feeds and marked as unread.
There are a few articles that I am sure I starred in the old installation and that are not marked as starred in the new one.
I have to dig deeper, but might it be that the import script excludes articles that are already in the database (not purged), or articles that are unread?

edit: I did an export and import of the full articles. The result is:
imported 8625 (out of 14240) articles, created 0 feeds.
and I already had about 6,000 unread articles, so I believe the script somehow skips content that is already in the target database, which makes sense for “standard” articles, but not for “starred” ones.
The perfect behavior would be to mark the unread articles it finds as starred.
Substandard would be to duplicate the articles, keeping them starred and marked.
If my guess is right, right now no harm is done, as I will eventually re-read the articles and mark them as starred if they are still interesting to me.

the idea is to import into an empty database. if you update things in parallel on the same feeds and stuff i’m not guaranteeing the results.

i think it overwrites any articles already in the database (by guid).
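either way the lookup is by guid, something like this (a sketch, not the plugin's actual code):

// guid is the natural dedup key across ttrss_entries
$sth = $pdo->prepare("SELECT id FROM ttrss_entries WHERE guid = ?");
$sth->execute([$article["guid"]]);

$exists = $sth->fetch() !== false; // decides skip vs. overwrite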

It makes perfect sense.
The issue is that it took me time to figure out how to migrate starred articles, and in the meantime the cron did its job…

the fact that some articles are not imported makes me think it skips them, but that’s just my feeling.

i ran your export which you sent me via PM on a fresh docker image, blank new user, here’s the results:

/var/www/html/tt-rss # sudo -u app php ./update.php --data_import dario.zip --data_user test
[09:27:24/130] Lock: update.lock
[09:27:24/130] importing articles of user test from dario.zip...
[09:27:24/130] processing 00000000.json
[09:27:39/130] imported 369 (out of 369) articles, created 56 feeds.

if i run it again, it shows 0 out of 369 which means that yeah, existing stuff is skipped, not overwritten.

it probably makes more sense to overwrite data on import regardless of whether it already exists in the database, to set correct flags (i.e. starred).

i also ran my own import while i was at it, which correctly imported my old starred archived articles, so it looks like i was wrong and this is already handled correctly.

e: i think ideally the updater should overwrite existing articles while notifying the user, i.e. show something like “X articles added, Y updated, Z skipped” at the end. if an article is skipped it should print a warning. i’ll make a note to implement this when i have some time to kill.

e2: replacing existing articles could be an option like --data_replace etc
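e3: a sketch of how that could fit together (the helpers and the flag are hypothetical at this point):

// with --data_replace, existing articles (matched by guid) get overwritten
// so flags like marked come through; otherwise they are skipped as before
foreach ($articles as $article) {
    if (entry_exists($pdo, $article["guid"])) {       // made-up helper
        if ($replace_mode) {
            overwrite_article($pdo, $article);        // made-up helper
            $updated++;
        } else {
            $skipped++;
        }
    } else {
        import_article($pdo, $article);               // made-up helper
        $added++;
    }
}

printf("%d articles added, %d updated, %d skipped.\n", $added, $updated, $skipped);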