Internationalization (i18n) #202
As far as I know, Fluent represents some really advanced thinking about localization, and it also has a Rust implementation. So to me, doing localization "the right way" in Askama would imply at least knowing what a Fluent integration would look like, how much complexity it would take to do, and how hard it would be for our users. From what I've seen, gettext may look simpler but also gets complex in the end. |
Fluent actually has quite a simple Rust API; most of the complexity is in the implementation: https://docs.rs/fluent-bundle/0.5.0/fluent_bundle/ Essentially, you create a [...] You can also do [...] The main impedance mismatch here is that Fluent operates at runtime, not at compile time. I could imagine doing some compile-time verification -- checking that the input FTL files parse, maybe even type-checking the messages used in templates -- but you'll still need to actually load and execute the templates at runtime. Ideally there could be an [...] I'd be happy to work on this in the next few weeks, although I'd need some help since I don't know askama very well :) |
Sounds great! Happy to answer questions and provide guidance/feedback on design and implementation. Having such a filter sounds good to me, and the directory also makes sense. We could take a similar approach to the templates dir, in that we could have a sane default ([...]). Not sure yet what you'd need the proc macro for. |
Proc macro would let you run compile-time instead of runtime checks on the fluent files. Could be overkill, but I've always thought there's no such thing as too much safety in Rust :) |
My PR adds initial support for i18n using Fluent, but the Fluent API currently has some issues. Once those are addressed, it'll be worth doing another pass. I don't think the i18n API will need to change on askama's end, but it should give users better performance and more support for locale fallback chains. |
Most of the relevant code's been moved over here to its own crate, which is almost ready to release; I just want to get askama and rocket working first. @djc, we can do the integration from either end: we could re-export baked_fluent from askama and add custom codegen to support it, or add a filter compatible with askama to baked_fluent. The only thing is that we'd need support for some sort of keyword arguments in filters if we wanted to do it that way. |
Do you have a preference? Either way is fine, really, though my intuitive preference is to have the filter living in askama, since it is an important feature to me. |
I'll open a PR to add support on this end. Only real issue I see is making sure people can find the docs; I can just duplicate them in the rustdoc over here though. |
@kazimuth what's the status on this? Have you got it working for your own application? |
Ah, some life stuff came up. Plus the job I was going to be using this for fell through, so I haven't been working on it much :/ baked_fluent is nearly ready for release; I just wanna get Rocket support working, then I'll throw it on crates.io. All that's needed on this end is the parser + codegen changes from my previous PR, and a reexport of the [...] |
@kazimuth did you have time to work on this? |
baked_fluent is pretty much functional; I think I'll release it soon. I don't have the bandwidth for an askama integration, unfortunately; if someone's interested in writing that, I can provide some tips. |
Ah...cool. Thanks for the info. |
@djc Do you have an overview of the current status of this? Am I right in saying that the localization feature has not been finished yet? |
You've found basically all there is to find -- that's the latest state as far as I'm aware. So moving forward, it'd probably be best to maybe fork baked_fluent (assuming that's allowed per its license) and then revive something like #237 to start integrating it with Askama? I'd be happy to mentor someone (you?) through doing the work! |
I would like to try it. I'll try getting started tomorrow. |
Don't hesitate to ask questions, here or on Matrix/Gitter! |
Hi, the way I would do it is, I guess, kinda hacky, but I figured I can just add a field to all my structs like this: [...] and in the template call [...] |
Have a look at #434. |
Okay, so I updated the code from #434 (all tests pass) and added the changes you suggested in this comment. The only thing left to do is to separate it out into its own feature that can be opted into, since I don't think everyone needs or wants fluent deps in their project. Once I've done that, I'll open a new PR to get this feature done, if that is still in scope/wanted? But that might take me a few days since I'm fairly new to the ecosystem. |
Yeah, would be great if you can turn that into a new PR, definitely still wanted! No hurry. |
After rebasing and refactoring the code, as well as addressing various review questions, CI is now all green again. |
So it never happened and i18n support is still not working? |
I'm using a low budget solution utilizing the template syntax itself, defining all translations in a base template others extend upon. Language-dependent positioning can be solved with a couple of macros on top. While this is very basic and not pretty, it completely avoids lookups at runtime and should be enough for many use cases.

`i18n.html` (the base template):

```jinja
{% let msg_greeting -%}
{% let msg_goodbye -%}
{% match lang %}
{% when "de" %}
{% let msg_greeting = "Hallo" %}
{% let msg_goodbye = "Aufwiedersehen" %}
{% when "es" %}
{% let msg_greeting = "¡Hola" %}
{% let msg_goodbye = "Adiós" %}
{% else %}
{% let msg_greeting = "Hello" %}
{% let msg_goodbye = "Bye" %}
{% endmatch %}
{% block content %}
{% call super() %}
{% endblock %}
```

`test.html`:

```jinja
{% extends "i18n.html" %}
{% block content %}
<p>{{ msg_greeting }}</p>
<p>{{ msg_goodbye }}</p>
{% endblock %}
```

```rust
#[derive(Template)]
#[template(path = "test.html")]
struct Test<'a> {
    lang: &'a str,
}
``` |
@tsurai here's how I am doing it:

```rust
pub mod filters {
    use rust_i18n::t;

    pub fn localise(s: &str, lang: &str) -> ::askama::Result<String> {
        let translation = t!(s, locale = lang);
        Ok(translation.to_string())
    }
}
```

and then in the askama template you can do this:

```html
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <title>{{ "hello"|localise("mr") }}</title>
</head>
<body>
</body>
</html>
``` |
I'm loving askama so far, but I'm trying to build my first real web app "the right way" and support localization from the start. I've never worked on localization before, nor with a parser.
The three main approaches I've come across are: [...]
Option 2 is probably the most reasonable solution, and staying Jinja-like would probably be beneficial. Would it make sense for askama to support some kind of gettext-like function?
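For concreteness, a gettext-like function at its core is just a per-locale lookup keyed by the source-language message, falling back to the message id when no translation exists. A minimal std-only sketch (all names and strings are illustrative, not askama API):

```rust
// Hypothetical gettext-style lookup that an Askama filter could wrap.
// In a real gettext setup these tables would come from compiled catalogs.
fn tr(lang: &str, msgid: &str) -> String {
    let translated = match (lang, msgid) {
        ("de", "Hello, world!") => "Hallo, Welt!",
        ("es", "Hello, world!") => "¡Hola, mundo!",
        // gettext convention: fall back to the message id itself.
        _ => msgid,
    };
    translated.to_string()
}

fn main() {
    assert_eq!(tr("de", "Hello, world!"), "Hallo, Welt!");
    assert_eq!(tr("fr", "Hello, world!"), "Hello, world!"); // unknown locale falls back
    println!("{}", tr("es", "Hello, world!"));
}
```

The fallback-to-msgid behavior is what makes gettext feel simple at first; the complexity mentioned earlier in this thread (plural rules, argument interpolation, locale chains) is what Fluent is designed to handle.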