With Guzzle v7, the class GuzzleHttp\Client was annotated as @final, as it will become a real final class in Guzzle v8. Extending Guzzle clients to enrich them with custom functionality or to pass configuration (e.g. API credentials) is now discouraged, and static code analysis tools like PHPStan may report this as an error. Depending on how GuzzleHttp\Client is extended, migration may be cumbersome. I've got your back, and I'll cover some common cases in this blog post.
In a project, we heavily extended GuzzleHttp\Client for all our use cases, as we wanted to make use of dependency injection via the clients' class names. The example below shows how our clients were implemented.
We defined our class App\Client\GithubClient
that extends GuzzleHttp\Client
without any further logic:
// src/Client/GithubClient.php
namespace App\Client;
class GithubClient extends \GuzzleHttp\Client
{
}
The client App\Client\GithubClient
is injected into the service App\Service\GithubService
:
// src/Service/GithubService.php
namespace App\Service;
class GithubService
{
public function __construct(
private readonly GithubClient $client
) {
}
}
The client App\Client\GithubClient
is configured with the API key in the project's services.yaml
file:
# config/services.yaml
services:
    App\Client\GithubClient:
        class: App\Client\GithubClient
        arguments:
            $config:
                headers:
                    Authorization: 'token %app.github.api_key%'
This is the simplest way a Guzzle client may be configured. A client just takes some configuration (like the Authorization header from the example above), and it's ready to use. The whole class can be replaced by a plain GuzzleHttp\Client instance, which is configured in our services.yaml file:
services:
    guzzle.client.github:
        class: GuzzleHttp\Client
        arguments:
            $config:
                headers:
                    Authorization: 'token %app.github.api_key%'
This requires another change! Since we are using GuzzleHttp\Client
now, autowiring the correct Guzzle client via its class name does not work anymore.
Now, we have to configure our service to use the guzzle.client.github
service explicitly:
services:
    App\Service\GithubService:
        class: App\Service\GithubService
        arguments:
            $client: '@guzzle.client.github'
Finally, we have to adjust the constructor of our service to expect an instance of \GuzzleHttp\Client as argument:
// src/Service/GithubService.php
namespace App\Service;
class GithubService
{
public function __construct(
private readonly \GuzzleHttp\Client $client
) {
}
}
This migration is pretty easy; the key changes are: the new service guzzle.client.github is a plain GuzzleHttp\Client, as our custom client App\Client\GithubClient became obsolete, and App\Service\GithubService is configured to use guzzle.client.github explicitly.

In some cases, our clients are more complex and need custom logic besides the configuration. A common use case is passing an API key via a query parameter in the request URL, for example as required by Google Maps.
See an example of how our GoogleMapsClient
was implemented before:
// src/Client/GoogleMapsClient.php
namespace App\Client;
class GoogleMapsClient extends \GuzzleHttp\Client
{
public function __construct(array $config, string $apiKey)
{
$handlerStack = \GuzzleHttp\HandlerStack::create($config['handler'] ?? null);
$config = array_merge($config, [
'base_uri' => rtrim($config['base_uri'] ?? '', '/') . '/',
'handler' => $handlerStack,
]);
$handlerStack->unshift(\GuzzleHttp\Middleware::mapRequest(static function (\Psr\Http\Message\RequestInterface $request) use ($apiKey) {
return $request->withUri(\GuzzleHttp\Psr7\Uri::withQueryValue($request->getUri(), 'key', $apiKey));
}));
parent::__construct($config);
}
}
In this example, we create a GuzzleHttp\HandlerStack
and add a middleware to it. The middleware is responsible for adding the key
query...
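Following the pattern from the first example, this client can also become a plain GuzzleHttp\Client built by a small factory instead of a subclass. The following is only a sketch: the class name App\Client\GoogleMapsClientFactory is an assumption, while the middleware logic is taken from the original client.

```php
// src/Client/GoogleMapsClientFactory.php (hypothetical)
namespace App\Client;

class GoogleMapsClientFactory
{
    public static function create(array $config, string $apiKey): \GuzzleHttp\Client
    {
        $handlerStack = \GuzzleHttp\HandlerStack::create($config['handler'] ?? null);
        $handlerStack->unshift(\GuzzleHttp\Middleware::mapRequest(
            static function (\Psr\Http\Message\RequestInterface $request) use ($apiKey) {
                // Append the API key as "key" query parameter to every request
                return $request->withUri(
                    \GuzzleHttp\Psr7\Uri::withQueryValue($request->getUri(), 'key', $apiKey)
                );
            }
        ));

        $config = array_merge($config, [
            'base_uri' => rtrim($config['base_uri'] ?? '', '/') . '/',
            'handler' => $handlerStack,
        ]);

        return new \GuzzleHttp\Client($config);
    }
}
```

In services.yaml, the service definition would then reference the factory, e.g. factory: ['App\Client\GoogleMapsClientFactory', 'create'].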
When implementing the fundamentals of my.typo3.org, an API based on Symfony was built to feed data to applications in the TYPO3 universe, e.g. the aforementioned my.typo3.org or the Certification Platform. This API must be rock-solid, thus it has decent test coverage for every single piece gluing the application together. With development going further, the amount of tests increased, including API endpoints that get their data from a database. In our test scenarios, we use a sqlite database as this needs no additional setup.
At some point in the development process, executing tests became slower and slower as the amount of tests and the respective amount of fixtures increased:
Time: 10:06.275, Memory: 916.00 MB
OK (771 tests, 3120 assertions)
The full test run needs ~10 minutes and consumes over 900 MB of RAM. Of course the whole test suite doesn't have to re-run when changing a single controller, but some changes are low-level and trial & error by letting the CI do the job is not really feasible.
Disclaimer: I'm developing on a 2019 Dell XPS 15 7590 with a hexacore Intel i7-9750H CPU, 32 GB RAM, an M.2 NVMe SSD, Ubuntu 20.10, and every project runs with ddev.
Reading data from and writing data back to the SSD shouldn't be that time-consuming, but I'm not deep enough into Symfony and sqlite internals to properly explain what's going on here. Luckily, Symfony makes it very easy to store the database in RAM by setting the database URL to sqlite:///:memory: instead. However, the first run didn't go well:
ERRORS!
Tests: 771, Assertions: 1575, Errors: 39, Failures: 299.
All failures are caused by the exception Doctrine\DBAL\Exception\TableNotFoundException: right after priming the database, the tables are not available anymore when importing the fixtures. After some research I found out that Symfony keeps the database in RAM until its kernel gets shut down, either on purpose or when a new kernel is created. This happened at three specific places:
A typical setUp()
looked like this:
protected function setUp(): void
{
    parent::setUp();
    $this->prime(); // calls static::bootKernel() as well
    $this->importFixture('path/to/fixture.php');
    static::bootKernel(); // boots a fresh kernel, wiping the in-memory database
}
The issue has been identified, so let's start fixing it. The kernel is now booted first in the test's setUp() method, and the primer demands an already booted kernel. If the primer cannot find a kernel, a \LogicException is thrown, which reveals non-adopted test classes. The primer is a trait imported in the test classes extending \Symfony\Bundle\FrameworkBundle\Test\KernelTestCase, so checking for a booted kernel is straightforward:
trait DatabasePrimer
{
public function prime(): void
{
if (!self::$booted) {
throw new \LogicException('Could not find a booted kernel');
}
// ...
}
}
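Put together, a test's setUp() now boots the kernel exactly once up front. A sketch, assuming prime() no longer boots a kernel itself:

```php
protected function setUp(): void
{
    parent::setUp();
    static::bootKernel();                        // boot once, up front
    $this->prime();                              // throws if no kernel is booted
    $this->importFixture('path/to/fixture.php'); // fixtures land in the live in-memory database
}
```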
The fixtures are imported by Doctrine's EntityManager, calling its persist() and flush() methods. This revealed another issue: the records imported to the database could not be found. The reason is that Doctrine maintains an identity...
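A common remedy in such situations is to detach all managed entities after the import, so that subsequent queries read fresh state from the database. A sketch, assuming the importing code holds the EntityManager in $this->entityManager:

```php
// Flush the imported fixtures, then clear the identity map so that
// later queries are not served from stale in-memory entities:
$this->entityManager->flush();
$this->entityManager->clear();
```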
In the recent release of TYPO3, namely 10.4.10 at the time of writing this blog post, a patch meant to improve the backend performance was merged: the introduction of SVG icon sprites. Unfortunately, this patch had unexpected consequences and led to some new experiences.
tl;dr: This blog post is a mix of "how good things went wrong" in terms of TYPO3 icons and personal drama based on recent observations. You're still with me? Great, please read further.
As always when patches for open source projects are written, everybody does it with the best intentions to scratch their own itches and those of other people; the same happens for TYPO3. Some changes have little impact, others include major refactorings. This patch introduces SVG icon sprites for an improved overall backend performance:
As written above, each change has an impact, but sometimes the expected impact turns out to be wrong. The same thing happened with this very change. With an upgrade of the @typo3/icons package to version 2.0, some important restructurings were done which apply to the TYPO3 CMS as well, of course.
Previously, icons were stored in sysext/core/Resources/Public/T3Icons, which is considered private API. Yes, private, albeit the icons are stored in the Public/ directory, simply because those icons are publicly callable assets.
This is not a bad thing, as with TYPO3 7.5 an Icon API was introduced, which has been in place for five years already at the time of writing this blog post.
An attentive reader noticed the word "previously" in the paragraph above. With the switch to version 2 of the icon repository, the location of these icons changed, which isn't a big deal as there is a stable API in place, right? Wrong. It turned out that extension authors out there rely on the absolute icon paths, either because of lack of knowledge, because "it's there", or because of mistrust in the APIs.
Although some people allege it, this change was not meant to break existing extensions; otherwise it would have been marked as "breaking" and it would not have been backported to v10. On the other hand, relying on specific, non-documented behavior makes every bugfix a breaking change.
Some honest words here: under regular circumstances I'd say "well yeah, you're out of luck". Unfortunately, I noticed that the general tone has become pretty harsh lately, especially on social media. People sometimes demand things in free software (please read the GPL, especially the "NO WARRANTY" part, thanks!), or, in even worse cases, attack the whole team that works hard on making TYPO3 better step by step.
Whatever the reason is, such cases make decisions even harder, as everything must be considered "breaking" in one way or another. Be assured that nobody willingly "hides" a breaking change somewhere as a drive-by change.
Header photo by Harpal Singh on Unsplash.
Back then, when each browser had its own set and understanding of "supporting" JavaScript features, one knight in shiny armor saved us maiden developers and allowed us to focus on our tasks: jQuery. There was no necessity to remember every browser quirk or buggy implementation, jQuery was there and covered us.
But sometimes we have to let things go. The TYPO3 Core minimizes the usage of jQuery in an ongoing process. For example, querySelector allows selecting a specific element by CSS selectors, similar to Sizzle, the selector engine developed and used by jQuery.
But other parts of jQuery get replaced as well with modern, native APIs. During December's Review Friday the new AJAX API got merged, which is built on the Fetch API under the hood. Fetch uses Promises, which makes it very easy to chain success or error handlers to a request.
The module is located at TYPO3/CMS/Core/Ajax/AjaxRequest
and may be used with RequireJS. A very basic example looks like this:
require(['TYPO3/CMS/Core/Ajax/AjaxRequest'], function (AjaxRequest) {
new AjaxRequest('https://httpbin.org/json').get().then(
async function (response) {
const data = await response.resolve();
console.log(data);
}
);
});
But what happens here? We create an AjaxRequest
object that only takes the target URL as argument. After that we're free in "decorating" the request. In this case, we send the request as GET
and process the response. The response is of type AjaxResponse
which exposes the methods resolve()
and raw()
. In this example we use resolve()
, which checks whether the response contains a Content-Type
header that matches to any JSON content. If this assumption is true, a JSON object is returned, otherwise we get a plaintext response that could contain any string.
What's going on with these async and await keywords? Put simply, this is a shortcut to handle Promises in a more comfortable way.
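As a quick, standalone illustration (not TYPO3-specific), these two variants do the same thing:

```javascript
function fetchValue() {
  // Stand-in for any function returning a Promise
  return Promise.resolve(42);
}

// Classic Promise chaining:
fetchValue().then(function (value) {
  console.log(value); // logs 42
});

// The same flow with async/await:
async function logValue() {
  const value = await fetchValue();
  console.log(value); // logs 42
  return value;
}

logValue();
```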
For more advanced usage, raw()
may be used instead which returns the original response object:
require(['TYPO3/CMS/Core/Ajax/AjaxRequest'], function (AjaxRequest) {
new AjaxRequest('https://httpbin.org/json').get().then(
async function (response) {
const raw = response.raw();
if (raw.headers.get('Content-Type') !== 'application/json') {
console.warn('We didn\'t receive JSON, check your request.');
return;
}
console.log(await raw.json());
}
);
});
That's all nice, but how are errors handled? Errors may happen anytime, either if the client's network is down or if the requested endpoint is not available anymore. The original implementation of Fetch is a bit strange here: errors sent by the remote (e.g. HTTP status 500) are not handled as failure, but client errors are. Since this is not feasible and doesn't improve the developer experience, AjaxRequest
always throws a ResponseError
, which contains the original response:
require(['TYPO3/CMS/Core/Ajax/AjaxRequest'], function (AjaxRequest) {
new AjaxRequest('https://httpbin.org/status/500').get().then(
function (response) {
// Empty on purpose
}, function (error) {
console.error('Request failed because of error: ' + error.response.status + ' ' + error.response.statusText);
}
);
});
But the API does more: if you need to add query arguments to the URL, call withQueryArguments()
of the request object. The method...
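Building on the example above, such a call could look like the following sketch; the exact argument formats accepted by withQueryArguments() are an assumption here:

```javascript
require(['TYPO3/CMS/Core/Ajax/AjaxRequest'], function (AjaxRequest) {
  new AjaxRequest('https://httpbin.org/get')
    // withQueryArguments() returns a new request instance whose URL
    // carries the given arguments as query string (?foo=bar&baz=bencer)
    .withQueryArguments({foo: 'bar', baz: 'bencer'})
    .get()
    .then(async function (response) {
      console.log(await response.resolve());
    });
});
```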
In TYPO3's history there have been a lot of ups and downs, in any regard. This blog post tells you something about JavaScript in TYPO3: how was it back then, what's happening now and what will maybe happen.
In the past, the JavaScript implementations were some bunch of loose scripts that did some stuff. There was no clear separation of concerns and the scripts were loaded in a defined order and with some luck, everything worked well. The scripts had (sometimes hidden) dependencies to other scripts, so extending them or fixing bugs was always some kind of a challenge.
In some cases, the JavaScript calls were not triggered by proper event handling, but by direct calls injected by PHP, which makes the whole situation worse and more error-prone. Unfortunately, this is still the case today.
Back then there were some JavaScript frameworks that were delivered with TYPO3: prototype, script.aculo.us and jQuery. The modules based on any of these frameworks were partially a mess, as a module that was clearly written for one specific framework also used APIs of the other frameworks.
With ExtJS, the TYPO3 Core received some kind of modularization, but that came at a high price: it's very complicated for beginners to extend the functionality of ExtJS, plus at some point the support of the used ExtJS version ended. Since parts of the TYPO3 backend highly depended on ExtJS components (page tree, workspaces, extension manager, recycler and some more), this was the worst case, since the used ExtJS version had some critical bugs in regard to Internet Explorer and an upgrade was not possible without rewriting everything.
With the start of TYPO3 7 the core team decided to streamline the JavaScript modules and to get rid of any framework that blocks further development. Since prototype and script.aculo.us seemed to be abandoned (and actually they are, prototype got its last release in September 2015, script.aculo.us in December 2010!), all JavaScript was rewritten to either use Vanilla JS or jQuery. That was also the time where RequireJS was introduced (also see this blog post). The goal of RequireJS is to have a proper dependency handling between JavaScript files to make sure every required module is loaded before the actual module. It's also possible to load another module "on-the-fly" when it's really required.
In the same run, most of the ExtJS components were rewritten to RequireJS and jQuery, as we decided to get rid of ExtJS as well, for obvious reasons. But getting rid of ExtJS should take some more time.
In TYPO3 v8, most of ExtJS got removed. The so-called "viewport" (the "frameset" of the backend with the top bar, module menu and content box) was rewritten to plain HTML and jQuery, which drastically reduced the processing time and the memory footprint of the backend. Only the page tree was left.
With TYPO3 8.4, TypeScript, a typed superset of JavaScript, was introduced into the Core. But why? The main feature of...
This post tells the story of modernizing the extension t3editor.
On my journey migrating a lot of stuff to jQuery and RequireJS during TYPO3 CMS 7 development, Benni Mack asked me to migrate the extension t3editor
to jQuery as well. It sounded like an easy job - I've never been so wrong.
The extension was literally a hybrid of any existing JavaScript framework that was available back then when the extension was created: CodeMirror, prototype, script.aculo.us and ExtJS.
The most challenging part was migrating our custom plugins for code completion to RequireJS, because of clean dependency handling. It also took some effort to get the layout of the code completion functional again. After two months (and a tiny patch in codemirror itself), the migration was done and the patch was merged. But, of course, there were still some glitches that were fixed in follow-up patches. Finally, the extension only uses CodeMirror, jQuery and RequireJS.
But behold - if that's the end of the story I wouldn't put it into a blog post.
The journey continues. We noticed that CodeMirror itself is really old with version 2.x, while version 5.x was out there in the wild, so the adventure continues. Several efforts were taken, all of them failed at a certain point, the latest one was after DevDays '16 in Nuremberg. After that, it went silent around t3editor. After taking weeks of thinking about how to refactor the whole codebase, I've started with one large pitfall that broke my neck before: Port everything that uses CodeMirror to FormEngine before doing any major overhaul.
Two parts of the backend were affected by this change: the "Template" and the "Filelist" backend modules. The "Template" module had a custom, dedicated edit view for sys_template records. That view was dropped and EditDocumentController is now used for this purpose. That was the easy patch. The more challenging part was the file handling stuff, as the forms had a custom structure and literally every file handling operation was affected by this. After only 12 patch sets, that patch was merged as well. Bonus: both patches removed some hooks that were barely used at all (probably only by t3editor itself).
And now begins the fun stuff. The first thing I did was copying the most recent CodeMirror release into the core. After that, I removed any custom code and FormEngine hacks and, well, it wasn't so bad at all. The main library was working, and then I noticed: CodeMirror comes with a lot of so-called "modes" (syntax highlighting) and add-ons. I didn't want to include all available modes and add-ons, so I've implemented a configuration structure that allows registering additional modes and add-ons on a per-extension basis. After struggling with the add-ons' dependencies (even with RequireJS, you have to load every dependency yourself), everything was fine so far. One of the last and most challenging...
New content element wizard items used to be registered via a PHP class referenced in $GLOBALS['TBE_MODULES_EXT']['xMOD_db_new_content_el']['addElClasses'].
This approach is not deprecated yet, but discouraged within the TYPO3 Core, as none of the shipped extensions use this mechanism. As an exception to the rule, EXT:indexed_search still uses this approach, in case EXT:compatibility7 is installed.
In this article we're going to migrate the PHP-based classes to plain TypoScript. As an example extension, we use tt_news 7.6.3.
In its ext_tables.php
you may find the following code:
$TBE_MODULES_EXT['xMOD_db_new_content_el']['addElClasses']['tx_ttnews_wizicon'] = TYPO3\CMS\Core\Utility\ExtensionManagementUtility::extPath($_EXTKEY).'pi/class.tx_ttnews_wizicon.php';
The registered class itself has a method called proc()
that basically modifies an array only:
function proc($wizardItems) {
// ...
$wizardItems['plugins_tx_ttnews_pi'] = array(
'icon'=>TYPO3\CMS\Core\Utility\ExtensionManagementUtility::extRelPath('tt_news').'pi/ce_wiz.gif',
'title'=>$LANG->getLLL('pi_title',$LL),
'description'=>$LANG->getLLL('pi_plus_wiz_description',$LL),
'params'=>'&defVals[tt_content][CType]=list&defVals[tt_content][list_type]=9'
);
return $wizardItems;
}
Migrating this to TypoScript is a rather easy task. To begin, a new file called Configuration/PageTS/NewContentElementWizard.ts is created within the extension directory:
mod.wizards {
newContentElement.wizardItems {
plugins {
elements {
plugins_tx_ttnews_pi {
icon = EXT:tt_news/pi/ce_wiz.gif
title = LLL:EXT:tt_news/Resources/Private/Language/locallang.xml:tt_news_title
description = LLL:EXT:tt_news/Resources/Private/Language/locallang.xml:tt_news_description
tt_content_defValues {
CType = list
list_type = 9
}
}
}
}
}
}
You probably noticed that the structure of the TypoScript is nearly the same as the PHP-based array, but there are a few minor differences: paths may be prefixed with EXT:, which TYPO3 resolves automatically, and the params key is replaced by tt_content_defValues.
It is very important that the configuration is wrapped within this scaffold, otherwise the TypoScript won't work:
mod.wizards {
newContentElement.wizardItems {
plugins {
elements {
// YOUR CODE
}
}
}
}
Now, the TypoScript file must be registered. This is done within ext_localconf.php, NOT ext_tables.php (every time you do this a unicorn dies, now do the math 😉):
\TYPO3\CMS\Core\Utility\ExtensionManagementUtility::addPageTSConfig(
'<INCLUDE_TYPOSCRIPT: source="FILE:EXT:tt_news/Configuration/PageTS/NewContentElementWizard.ts">'
);
We're nearly done. The last thing we have to do is remove any evidence of the old PHP-based approach. That means you should remove the registration based on $GLOBALS['TBE_MODULES_EXT']['xMOD_db_new_content_el']['addElClasses'] and clear the TYPO3 caches afterwards. If everything still works fine, it's safe to remove the wizicon PHP class.
This series cleans up the ext_tables.php file of a real extension that was originally developed for TYPO3 4.5.
In the old times, literally everything was placed in ext_tables.php: TCA, content element wizards, TypoScript includes and a lot more. In this series I'll guide you through cleaning up that mess into a proper structure. Today we handle the Table Configuration Array, or "TCA" for short.
As already teased, this is a real world extension. However, the extension name is anonymized. No further cleanups were made.
The TCA is probably split into two files: ext_tables.php
and a dedicated tca.php
.
The part in ext_tables.php:
$TCA["my_extension_table"] = Array (
"ctrl" => Array (
'title' => 'LLL:EXT:my_extension/locallang_db.xml:my_extension_table',
'label' => 'title',
'tstamp' => 'tstamp',
'crdate' => 'crdate',
'cruser_id' => 'cruser_id',
'versioningWS' => TRUE,
'origUid' => 't3_origuid',
"default_sortby" => "ORDER BY crdate DESC",
"delete" => "deleted",
"enablecolumns" => Array (
"disabled" => "hidden",
"starttime" => "starttime",
"endtime" => "endtime",
"fe_group" => "fe_group",
),
"dynamicConfigFile" => t3lib_extMgm::extPath($_EXTKEY)."tca.php",
"iconfile" => t3lib_extMgm::extRelPath($_EXTKEY)."icon_tx_myextension_table.gif",
"dividers2tabs" => 1,
"canNotCollapse" => 1,
),
"feInterface" => Array (
"fe_admin_fieldList" => "hidden, starttime, endtime, fe_group, title, content",
)
);
The part in tca.php looks like this:
$TCA["tx_myextension_table"] = Array (
"ctrl" => $TCA["tx_myextension_table"]["ctrl"],
"interface" => Array (
"showRecordFieldList" => "hidden,starttime,endtime,fe_group,title,content,info,version,edit_date,edit_user"
),
"feInterface" => $TCA["tx_myextension_table"]["feInterface"],
"columns" => Array (
"hidden" => Array (
"exclude" => 1,
"label" => "LLL:EXT:lang/locallang_general.xml:LGL.hidden",
"config" => Array (
"type" => "check",
"default" => "0"
)
),
"starttime" => Array (
"exclude" => 1,
"label" => "LLL:EXT:lang/locallang_general.xml:LGL.starttime",
"config" => Array (
"type" => "input",
"size" => "8",
"max" => "20",
"eval" => "date",
"default" => "0",
"checkbox" => "0"
)
),
"endtime" => Array (
"exclude" => 1,
"label" => "LLL:EXT:lang/locallang_general.xml:LGL.endtime",
"config" => Array (
"type" => "input",
"size" => "8",
"max" => "20",
"eval" => "date",
"checkbox" => "0",
"default" => "0",
"range" => Array (
"upper" => mktime(0,0,0,12,31,2020),
"lower" => mktime(0,0,0,date("m")-1,date("d"),date("Y"))
)
)
),
"fe_group" => Array (
"exclude" => 1,
"l10n_mode" => "mergeIfNotBlank",
"label" => "LLL:EXT:lang/locallang_general.xml:LGL.fe_group",
"config" => Array (
"type" => "select",
"size" => 20,
"maxitems" => 20,
"items" => Array (
Array("", 0),
Array("LLL:EXT:lang/locallang_general.xml:LGL.hide_at_login", -1),
Array("LLL:EXT:lang/locallang_general.xml:LGL.any_login", -2),
Array("LLL:EXT:lang/locallang_general.xml:LGL.usergroups", "--div--")
),
"foreign_table" => "fe_groups",
"foreign_table_where" => "ORDER BY title",
"itemListStyle" => "width:350px;",
"selectedListStyle" => "width:350px;",
)
),
"title" => Array (
"exclude" => 1,
"label" => "LLL:EXT:my_extension/locallang_db.xml:tx_myextension_table.title",
"config" => Array (
"type" => "input",
"size" => "50",
"max" => "250",
"eval" => "required,trim",
)
),
"content" => Array (
"exclude" => 1,
"label" => "LLL:EXT:my_extension/locallang_db.xml:tx_myextension_table.content",
"config" => Array (
"type" => "text",
"cols" => "30",
"rows" => "5",
"wizards" => Array(
"_PADDING" => 2,
"RTE" => Array(
"notNewRecords" => 1,
"RTEonly" => 1,
"type" => "script",
"title" => "Full screen Rich Text Editing|Formatteret redigering i hele vinduet",
"icon" => "wizard_rte2.gif",
"script" => "wizard_rte.php",
),
),
),
"defaultExtras" => "richtext[]:rte_transform[mode=ts]",...
In the old days, AJAX handlers were registered in ext_tables.php via \TYPO3\CMS\Core\Utility\ExtensionManagementUtility::registerAjaxHandler(). Although this still works, you should refrain from it, as it is deprecated since TYPO3 v8. As an alternative, you should use routes. To achieve this, create the file Configuration/Backend/AjaxRoutes.php in your extension and register your routes:
<?php
return [
'unique_identifier' => [
'path' => '/unique/identifier',
'target' => \FooBar\Baz\Controller\AjaxController::class . '::myAwesomeAction'
],
'do_something' => [
'path' => '/do/something',
'target' => \FooBar\Baz\Controller\AjaxController::class . '::helloWorldAction'
]
];
Please be aware that the identifier and the path must be unique to avoid naming collisions.
After changing this file, the caches must be cleared.
After that, the routes are callable via AJAX. In an AMD module, you may call your action like this way:
$.ajax({
url: TYPO3.settings.ajaxUrls['unique_identifier'],
method: 'GET',
dataType: 'html',
success: function(response) {
console.log(response);
}
});
In this example, we let the action return HTML code. As the default content type for PSR-7-based AJAX actions in the backend is application/json, I'll show you how to change the content type:
public function helloWorldAction(
\Psr\Http\Message\ServerRequestInterface $request,
\Psr\Http\Message\ResponseInterface $response
) {
$response->getBody()->write('<b>Hello <i>World</i></b>');
$response = $response->withHeader('Content-Type', 'text/html; charset=utf-8');
return $response;
}
In the frontend you still use eID. Those may also run with PSR-7 now, but this is not mandatory yet. The registration itself changes slightly: instead of passing a file, you now pass a controller::action combination:
$GLOBALS['TYPO3_CONF_VARS']['FE']['eID_include']['fnord'] = \FooBar\Baz\Controller\FrontendAjaxController::class . '::fnordAction';
Thanks to the PSR-7 standard, the common "layout" of the action looks like the example from the backend:
public function fnordAction(
\Psr\Http\Message\ServerRequestInterface $request,
\Psr\Http\Message\ResponseInterface $response
) {
$response->getBody()->write('I\'m content fetched via AJAX.');
return $response;
}
In contrast to the backend, the default content type for eID remains text/html.
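For completeness, the eID script registered above is called with the eID query parameter. A sketch using the same jQuery style as the backend example; the URL assumes a default TYPO3 frontend setup:

```javascript
$.ajax({
  url: '/index.php?eID=fnord', // matches the identifier registered in eID_include
  method: 'GET',
  dataType: 'html',
  success: function (response) {
    console.log(response);
  }
});
```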
You may still have modX/index.php files for your backend modules. That's not nice, but there is help: PSR-7. This tutorial shows how to set up your backend modules according to PSR-7.
The PSR-7 standard describes the communication with HTTP messages.
You probably set up your backend modules this way in your ext_tables.php:
$extPath = \TYPO3\CMS\Core\Utility\ExtensionManagementUtility::extPath($_EXTKEY);
\TYPO3\CMS\Core\Utility\ExtensionManagementUtility::addModule(
'web', 'txmoduleM1', '', $extPath . 'mod1/', [
'access' => 'group,user',
'name' => 'web_txmoduleM1',
'labels' => [
'tabs_images' => [
'tab' => 'moduleicon.gif',
],
'll_ref' => 'LLL:EXT:huselpusel/mod1/locallang.xlf',
]
]
);
The hard-coded extension path doesn't look nice and even triggers an entry in the deprecation log:
Registered "web_txmoduleM1" as a script-based module. Script-based modules are deprecated since TYPO3 CMS 7. Support will be removed with TYPO3 CMS 8, use the "routeTarget" option or dispatched modules instead.
To achieve this, the first step is creating a new controller, let's call it Classes/Controller/BackendModuleController.php
:
<?php
namespace Foobar\Huselpusel\Controller;
use TYPO3\CMS\Backend\Module\BaseScriptClass;
class BackendModuleController extends BaseScriptClass
{
}
The next step is adding a public method that is called for dispatching the module. Let's call it mainAction:
public function mainAction(
\Psr\Http\Message\ServerRequestInterface $request,
\Psr\Http\Message\ResponseInterface $response
) {
// Logic goes here...
}
mainAction is the entry point to the backend module, which takes the ServerRequestInterface and ResponseInterface of PSR-7 as parameters. This method handles the logic of your backend module. There is one important thing that behaves completely differently: content is not echoed anymore (previously done most likely by the method printContent()); you must use the ResponseInterface object instead.
You may get your GET and POST parameters with the request object:
$get = $request->getQueryParams();
$post = $request->getParsedBody();
To return the content, you may write into response's body and return the object:
$response->getBody()->write($this->doStuff());
return $response;
You may also define an HTTP status code or set the content type:
if ($FAIL) {
$response = $response->withStatus(500);
} else {
$response->getBody()->write($this->doStuff());
$response = $response->withHeader('Content-Type', 'text/html; charset=utf-8');
}
The content type for backend modules is text/html by default; changing this is more important for AJAX calls based on PSR-7, but this will be explained in another blog post (hint hint).
So far, so good. Let's get back to ext_tables.php where the module is registered. The fourth parameter of ExtensionManagementUtility::addModule() used to require the path to the module. Drop this, it must be an empty string now. As a replacement, extend the configuration array with the new key routeTarget:
[
'routeTarget' => \Foobar\Huselpusel\Controller\BackendModuleController::class . '::mainAction',
'access' => 'group,user',
'name' => 'web_txmoduleM1',
//...
]
After clearing the caches of TYPO3, the backend module is now called the PSR-7 way and there should be no new entry in the deprecation log. You may copy the whole logic of your former modX/index.php with one exception:
$SOBE = GeneralUtility::makeInstance('tx_huselpusel_module1');
$SOBE->main();
$SOBE->printContent();
Drop that code, you don't need it anymore as the module is already dispatched with your mainAction
.
You may move the rest of mod1
into their appropriate location:
moduleicon.gif =>...