We can now rotate thumbnails based on EXIF orientation data. Also,
thumbnails are tracked by filename, so we can delete them from storage
when we feel like it.
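For reference, a minimal sketch (not the actual implementation) of rotating
an image according to its EXIF Orientation tag using PHP's standard exif and
gd functions; $path is just an illustrative variable:

$exif = exif_read_data($path);
$img  = imagecreatefromjpeg($path);
$orientation = isset($exif['Orientation']) ? $exif['Orientation'] : 1;
switch ($orientation) {
    case 3: $img = imagerotate($img, 180, 0); break; // upside down
    case 6: $img = imagerotate($img, -90, 0); break; // rotate 90 degrees clockwise
    case 8: $img = imagerotate($img,  90, 0); break; // rotate 90 degrees counter-clockwise
}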
Conversation trees work pretty badly with the current layout, javascript
etc., so it's best if we separate them out and work on them as a side-project.
The oldschool settings are currently being deprecated (or broken out like this).
I'll hold off on removing the User preferences for the oldschool conversation
tree, since that might be reusable data, but I guess it will go in the near future.
This will simplify actions for future development and maintenance
since we can automate much more (such as auto-running show[Ajax|Page])
and handle errors of various kinds. Essentially the same kind of
improvements as Managed_DataObject gives us.
notice.id will give us even really old posts that were only recently
imported, for example if a remote instance had problems and only just
managed to post here. Another solution would be to have a
'notice.imported' field.
It seems it was only used to get a _single_ file attachment from
the posted notice, with no possibility to get multiple attachments.
If one fetches metadata about attachments for the notice, we have
enough data there to fulfill anyone's fetching dreams.
This makes it easier to disable, but remember that you must then
either enable and maintain queue daemons or disable queueing (and
handle whatever remaining queue items are stored in the database)!
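For example, disabling queueing entirely would be a one-liner in config.php
($config['queue']['enabled'] is assumed to be the relevant switch here):

// process queue items inline instead of via queue daemons
$config['queue']['enabled'] = false;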
We can't say we officially support PostgreSQL, unfortunately. There
are too many database calls with MySQL-specific syntax. This would be
desirable for a 2.0 release, but is too much work while maintaining 1.x.
The main difficulty is that we're using PEAR::DB, which is aging. If
that's replaced, maybe we could use PDO or something.
Read more at http://microformats.org/
Also, the tooltip text on time representations for humans has been improved.
Unfortunately no standardised representation (like "RFC850") has 4-digit years.
The File object now stores width and height of files that can
supply this kind of information. Formats which we cannot read
natively in PHP do not currently benefit from this; however, an
event hook will be introduced later.
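As a hedged illustration of what "natively readable" means: for formats that
PHP's getimagesize() understands, the dimensions we store are available like
this ($path is a placeholder):

$info = getimagesize($path);
if ($info !== false) {
    $width  = $info[0];
    $height = $info[1];
}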
The CreateFileImageThumbnail event is renamed to
CreateFileImageThumbnailSource, to clarify that hooks should not
generate their own thumbnails but only the source image. It also now
accepts File objects, not MediaFile objects.
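A hedged sketch of what a plugin hook could look like after the rename; the
exact parameter list is an assumption on my part and extractStillFrame is a
hypothetical helper:

// inside a plugin class: provide a source image for the thumbnailer,
// do not generate the scaled thumbnail yourself
public function onCreateFileImageThumbnailSource(File $file, &$imgPath, $media=null)
{
    if ($media !== 'video') {
        return true;                               // not ours, let other plugins try
    }
    $imgPath = $this->extractStillFrame($file);    // hypothetical helper
    return false;                                  // we provided a source image
}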
The thumbnail generation is documented in the source code. For
developers, call 'getThumbnail' on a File object and hope for the best.
Default thumbnail sizes have increased to be more appealing.
Filename collisions are avoided with a date prefix (shorter than before)
and a 4-character random alphanumeric string. I bet someone could
mass-upload files and generate all combinations of aaaa-zzzz during the
course of a day, but then maybe that user should be disabled anyway :)
(Filling the collision space entirely would cause a never-ending loop.)
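An illustrative sketch of that naming scheme (not the exact code, which also
retries on collision, hence the never-ending loop mentioned above):

function sketch_filename($ext)
{
    $chars = 'abcdefghijklmnopqrstuvwxyz0123456789';
    $rand  = '';
    for ($i = 0; $i < 4; $i++) {
        $rand .= $chars[mt_rand(0, strlen($chars) - 1)];
    }
    return date('Ymd') . '-' . $rand . '.' . $ext;
}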
The exception thrown from MediaFile will be caught and simply result in
no thumbnail at all right now. In the future we might use a catch-all
and have a "cannot generate preview"-icon or something.
VideoThumbnails requires php5-ffmpeg and php5-gd.
spl_autoload_register now calls the GNUsocial_class_autoload function
instead of us replacing the magic __autoload($cls). This means we can
queue up other autoload functions, such as the one now used for extlib
functions which exist directly in the 'extlib/' folder or have proper
namespacing (which our new Markdown class does).
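Roughly, the registration now looks like this (the extlib callback below is a
sketch of the idea, not the verbatim code):

spl_autoload_register('GNUsocial_class_autoload');

// extlib classes, either directly in extlib/ or with proper namespacing
spl_autoload_register(function ($class) {
    $file = INSTALLDIR . '/extlib/' . str_replace('\\', '/', $class) . '.php';
    if (file_exists($file)) {
        require_once $file;
    }
});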
At the same time we remove the "filecommand" setting, since we will
likely not have use of it thanks to PECL fileinfo.
Also the "supported" list for attachment mime types has changed
format, so we can keep track of at least some known file extensions.
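The new format maps each MIME type to a known file extension, roughly like
this (an illustrative subset, assuming the 'supported' list lives under
$config['attachments']):

$config['attachments']['supported'] = array(
    'image/png'  => 'png',
    'image/jpeg' => 'jpeg',
    'image/gif'  => 'gif',
);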
Because if you have your own local/closed community, you likely don't
want random newcomers who drop in, spam and leave dead accounts.
The admin can of course always override this by setting the config
"inviteonly" to false, either in config.php or on the website.
Added the following FIXME:
How should a Twitter user get their Inbox filled with foreign tweets?
Every imported Twitter user has a profile in the Profile table, so we
could set up a Subscription entry for each of those, meaning they get
collected in the InboxNoticeStream... But this would mean a lot of
unnecessary entries and listings that generally just point to the
locked down Twitter service.
Let's figure out a good relation so we can connect any profile to any
imported foreign notice, so it shows up in the "all" feed.
Also cleaned up and made typing stricter for the stream, so only
profiles can be submitted. This reasonably also means we can create
"inbox" or "all" streams for foreign profiles as well using the same
stream handler (but of course only for messages we already know about).
To avoid looking up posts for a long time in a large notice database,
the lookback period for the inbox goes no further back than the profile
creation date (this matches the behaviour of Inbox).
Inbox class can probably be removed now.
Many of the microapps are pretty javascript-dependent, but at the very
least we should let users get to the new notice field even when
javascript is not allowed to run in the browser. :)
My reasoning: Minifying makes third party review harder. A visitor on
a GNU social site should have no problem reading, understanding and
modifying the javascript to their own liking. A minified script is much
more difficult to use, reuse, modify and share.
Free software is not minified.
Generally the Cron plugin will run queue handling as long as less than
1 second of execution time has passed since the start of Action processing.
If you want to change this (such as disabling it with 0 seconds, or maybe
running bigger chunks, for like 4 seconds) you can do this, where 'n' is
the time in seconds:
addPlugin('Cron', array('secs_per_action' => n));
Add 'rel_to_pageload'=>false to the array if you want to run the queue
for a certain amount of seconds _despite_ maybe already having run that
long in the previous parts of Action processing.
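For example, a config that allows up to 4 seconds of queue work per request,
counted independently of how long the rest of the Action processing took (the
values here are just an illustration):

addPlugin('Cron', array('secs_per_action' => 4, 'rel_to_pageload' => false));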
If you want to run the cron script remotely, using a machine capable
of background processing (or locally, to avoid running daemon processes),
simply do an HTTP GET request to the route /main/cron of your GNU social instance.
Setting secs_per_action to 0 in the plugin config will imply that you run
all your queue handling by calling /main/cron (which runs as long as it can).
/main/cron will output "0" if it has finished processing, "1" if it should
be called again to complete processing (because it ran out of time due to
PHP's max_execution_time INI setting).
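A hedged sketch of driving that from a remote machine: keep requesting
/main/cron until it reports "0" (the site URL is a placeholder):

do {
    $out = trim(file_get_contents('https://social.example.org/main/cron'));
} while ($out === '1');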
The Cron plugin also runs events as close to hourly, daily and weekly
as you get, based on the opportunistic method of running whenever a user
visits the site. This means of course that the cron events should be as
fast as possible, not only to avoid delaying page load for users but
also to minimize the risk of running into PHP's max_execution_time. One
suggestion is to only use the events to add new queue items for later processing.
These events are called CronHourly, CronDaily, CronWeekly - however there
is no guarantee that all events will execute, so some kind of failsafe,
transaction-ish method must be implemented in the future.
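A sketch of that suggestion, where 'MyPlugin' and the 'mydaily' queue
transport are illustrative names:

class MyPlugin extends Plugin
{
    public function onCronDaily()
    {
        // keep this fast: only enqueue, do the real work in a queue handler
        $qm = QueueManager::get();
        $qm->enqueue('daily-maintenance', 'mydaily');
        return true;    // let other plugins handle CronDaily too
    }
}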
To make StatusNet::addPlugin() accept only arrays, lib/default.php had
to be changed, because all plugins had null as their default value
instead of an array.
If you're using XMPP by setting $config['xmpp'][*] then you should do:
addPlugin('Xmpp', $config['xmpp']);
because setting it directly in $config[''] won't do anything.
Also, default resource for XMPP is now 'gnusocial'. If you want something
more random, set it in your addPlugin config array.
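For example (assuming the 'resource' key in the XMPP config array is what sets it):

$config['xmpp']['resource'] = 'gnusocial-' . substr(md5(mt_rand()), 0, 8);
addPlugin('Xmpp', $config['xmpp']);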
Also removed the entirely unused saveGroups function.
Now avoiding multiGet and using listFind in Profile->getGroups()
so we don't have to deal with ArrayWrapper.
StatusNet chooses the first content element in an Atom feed, while
it should really choose the 'html' representation for its 'rendered'
field and the 'text' representation for its (text-only) 'content' field.
GNU social will implement a better algorithm for retrieving Atom
feeds, but that is yet to be done. So to avoid having link-less posts
on remote nodes, we'll just do the old switch-a-roo.
Other Atom readers, such as Mozilla Firefox, have the reverse priority
(choosing the last of the content elements).
_flow_ reported on IRC that install.php had stopped working. This was
because default plugins had been put into two separate lists, and the
list with AuthCrypt was never loaded when performing an installation.
Core plugins cannot be disabled.
I also removed the Memcache autodetection thing since it should be
solved in a more elegant manner.