Remove REM/EM from specification? #218
Is there a different solution you have in mind to replace it with, or a different relative unit that is platform-agnostic? They have provided a unit for absolute values (px) and a unit for relative values (rem) in the spec, and have noted that a translation tool will need to provide appropriate conversions when preparing tokens for a specific platform. There is definite value in having both absolute and relative values available for dimensions, so if there is another direction that is better, it'd be great to look into it.
IF the spec provided absolute units, e.g. a fontSize: '16px' output, I'm all good. Please close the request.
REM and EM are appropriate values, while the pixel is not. Apple lists its default font sizes for iOS: body text defaults to 17pt, which corresponds to one rem, and depending on the user's preferences, the value of one rem or em changes for accessibility. Android has an equivalent mechanism. Using pixel values for design tokens is the wrong approach these days. REM and EM values, on the other hand, are quasi-platform-agnostic units that most design systems also relate to.
A traditional px (or point) is the appropriate value; REM/EM are derived values driven by the idea of a CSS pixel. I believed 1pt was always equal to 1px (1/72") because iOS specifies typography in points, and that is how print works. My belief is confirmed when I set spec typography at size 17 (in Sketch, Figma, or Illustrator) for developer handoff: I can screenshot the render of that design on my iOS device and expect it to overlay perfectly on my designs. Even though Illustrator says the typography is in points and Figma's documentation says it's in px, both line up perfectly. The problem occurs only on the web, with the idea of a CSS pixel. This strange concept gives us math such as...
This ASSUMES a root font-size of 16px, and as we know, that is a variable that could be changed to anything in a web dev environment. Therefore, we should consider using the unit specification that matches the input defined by the designer in a design application as 'the standard', and leave the job to programmers to transform to any platform-specific values they like via Style Dictionary transforms.
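To make the suggestion above concrete, here is a minimal sketch of per-platform transforms. This is NOT Style Dictionary's real API; the token shape, transform names, and the 16px web base are all assumptions for illustration.

```typescript
// A token holds the designer's raw value in traditional points/px (1/72"),
// and per-platform transforms derive the final platform syntax.
type Token = { name: string; value: number };
type Transform = (t: Token) => string;

// The 16px rem base for web is an assumption; a real build would configure it.
const transforms: Record<string, Transform> = {
  web: (t) => `${t.value / 16}rem`, // e.g. 17 becomes "1.0625rem"
  ios: (t) => `${t.value}`,         // points, used as-is
  android: (t) => `${t.value}sp`,   // scale-independent pixels
};

function build(token: Token, platform: keyof typeof transforms): string {
  return transforms[platform](token);
}
```

The point of the sketch is that the raw value stays platform-agnostic; only the transform step commits to a unit.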
As I consider this more, it seems the correct way to express typography size as defined in design programs such as Sketch, Figma, Adobe XD, and Adobe Illustrator is the unit of POINTS, or 1/72". In a traditional sense, pixels and points were considered identical, but this is no longer true. Consider iOS Retina displays, and Android 1x, 2x, 5x. On the web, computer monitors vary around the nominal 1/96" density. Pixels do not have an intrinsic physical size, whereas points do.

To complicate things even more, CSS defines a pixel's physical size as 1/96", which I find curious because I have never typed 22.66 into an application to get the visual appearance of a 17pt font (17pt at 96/72 is 22.66px). Personally, I find it difficult to believe that 22.66px on screen (even at a standard observer viewing distance) is representative of the physical size of 17pt as printed on paper.

The ultimate pass/fail is when the coded implementation matches designs created in a design program, at normal user zoom. Design programs, as mentioned above, respect the 1/72" standard for both pixels and points. So rather than px, the most correct absolute value would be pt, taking care to say NOT a CSS pixel-point but a standard point. This is the stuff that burns my brain.

ADDENDUM: A better way to say it is that when it comes to typography, a 'px' and a 'pt' are indeed identical in size; both are 1/72 of an inch. To verify, type 'gh' at 100px in Times New Roman in Figma, export as a PNG, and import into Illustrator. Measure the height (from the top of the h to the bottom of the g) and you'll find it's close to 100px. Now type 'gh' in Illustrator at 100pt: once again, the same size. Next, do the same from CodePen; screenshot and import into Illustrator, and once again, the same size. I HOPE that puts to rest the idea that 1/96 is a 'thing'. It only manifests when using the pt unit on the web.
If I may, I would like to address the elephant in the room: why do we have to force a unit value in the spec? Maybe I'm wrong here, but for the spec to truly be agnostic, the consumer of the spec dictates which units to use, right?
@phun-ky It's 'nice' to know what the value represents before we transform it into the unit we need for the platform, but you make an excellent point (all puns intended). I'm perfectly happy with a unitless value. But let's imagine a Figma plugin did provide the option between different units on export. What functional benefit have we provided? Now the Style Dictionary programmer must check for the existence of the unit flag and respond appropriately if it is set to rem, but no other unit (because only two units are proposed). And, to transform correctly, the programmer needs to reference a global font-size, which I suspect the spec does not provide?
Well, I've experienced that you can have more than one unit in a component. What if the spec dictated a default unit, and then added an option to override it for special cases? Pick the unit that is most common or easiest to convert from?
Wanted to chime in and note that Tokens Studio has a base font size feature; by default it's 16px, and all tokens written in rem reference that. It can be changed to something else, like 24px: https://docs.tokens.studio/tokens/settings#base-font-size I think that's an excellent feature. I agree that "the consumer of the spec dictates which units to use". If they want to write rem, that should be their choice.
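The base font size idea above can be sketched in a few lines. This is an illustration of the concept, not Tokens Studio's implementation; the function name and defaults are invented.

```typescript
// A rem-valued token only becomes a concrete pixel size once a base is
// chosen; changing the base rescales every rem token at once.
function remToPx(rem: number, basePx: number = 16): number {
  return rem * basePx;
}

remToPx(1.5);     // 24 with the default 16px base
remToPx(1.5, 24); // 36 if the base is changed to 24px
```

The same 1.5rem token resolving to two different pixel sizes is exactly why several commenters below ask where the base value lives.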
Please research rem vs. px in the context of accessibility. |
I've been working with frontend since '98, and with accessibility since validation and Section 508 came around: you cannot exclusively have rem/em. This article sums it up: https://www.joshwcomeau.com/css/surprising-truth-about-pixels-and-accessibility/
@phun-ky Please re-read my comment; I did not say rem should be the only unit. It is correct that you need multiple units; one of those must be rem, or a platform-agnostic equivalent.
@romainmenke I'm sure neither phun-ky nor I were advocating the removal of rem from the web specifications or impacting WCAG accessibility. Everyone understands we need multiple units, especially if they have programmed Style Dictionary for multiple platforms. The question remains: should rem be embedded in the W3C Design Tokens specification?

In the future, I'd imagine nobody would manually type W3C Design Token JSON files, but would depend on Figma plugins to create the files for them. Because Figma edit seats are somewhat expensive, developers typically do not have edit access to Figma libraries, so they are unable to run plugins. Therefore designers will be outputting JSON for the developers. I'm a 'Design Technologist', so I'm concerned about areas of responsibility as designers and engineers interact with each other in our design system. Simply put, I would not put the responsibility of choosing REM vs PX in the hands of designers, and I certainly would not ask the designer to choose the base font-size and unit.

Though we'd like a world where we could output JSON from Figma, run it through a generic install of Style Dictionary, and get the result we need, that won't happen. We'll always need to customize Style Dictionary, so we should let it do what it does best. Keeping the W3C Design Token specification simple helps SD do its job. Instead of adding an explicit unit to the spec, consider standardizing on the current unitless numeric value that indicates px (or the traditional 1/72" point). I believe adding units puts the responsibility of the definition in the wrong hands and, IMHO, is an optimization that adds unnecessary complexity.
The entire purpose of this specification is to create a standardized interface so that specific values can flow freely between tools (either design or translation tools). If it becomes a requirement for developers to preprocess and manually rewrite values so that they have the correct unit, then I don't see the point of this specification :) Designers should learn when to use relative and when to use absolute units imho. I see this as a problem that needs to be solved for design tokens to fulfill their intended purpose.
It comes down to knowing the reason why tools such as Theo or Style Dictionary exist, the challenge of auto-exporting tokens from Figma via plugins, knowing the core mission of W3C Design Tokens, and an understanding of what a standard is able to accomplish.
Standardization of Design Tokens is great and solves a lot of problems. But, we need to understand what problems are being solved, why they are being solved, what the benefits are, what the scope of success is, and finally, what is outside of scope.
If you think of design tokens as expressions of design intent, then there is a big difference between the choice of a relative unit like rem and an absolute unit like px. I believe it's important to be able to express both of those intents, which is why the DTCG spec allows both. We've borrowed CSS's unit names, as we felt those are likely to be familiar to many folks working in and around design systems; I suppose we could have adopted Android's unit names instead.

As for why have units at all: in order for translation tools like StyleDictionary or Cobalt to "know" how to convert a token value into the appropriate platform-specific value and syntax, they need that info. As others have pointed out, it's not just the Web that has this concept; Android, for example, distinguishes density-independent from scale-independent sizes.

Sadly, design tools like Figma do not currently have this concept, but I don't believe that's a reason to limit the spec's expressiveness (if anything, I'd hope it might nudge design tool makers toward adding support for something akin to relative units).

It's also worth noting that design tools (or any tools that might create or manipulate DTCG files) don't necessarily need to expose the "raw" DTCG values to their users, just as an export tool like StyleDictionary reads a DTCG file and emits platform-appropriate values rather than surfacing the raw syntax.
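The "unit as design intent" argument above can be illustrated with Android's two text-sizing units. This is a sketch under assumptions (the token shape and 16px base are invented, and a real translation tool would be far more complete): a relative token maps to sp, which follows the user's font-size preference, while an absolute token maps to dp, which does not.

```typescript
// A DTCG-style dimension value carrying a unit that expresses intent.
type Dimension = { value: number; unit: "px" | "rem" };

// Map relative intent to Android sp (scales with user font preference)
// and absolute intent to dp (density-independent but fixed).
function toAndroid(d: Dimension, basePx: number = 16): string {
  return d.unit === "rem" ? `${d.value * basePx}sp` : `${d.value}dp`;
}

toAndroid({ value: 1, unit: "rem" }); // "16sp", tracks user font scaling
toAndroid({ value: 4, unit: "px" });  // "4dp", stays fixed
```

Without the unit, the tool cannot know which of the two intents the designer meant.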
While I'm sure many teams will operate in that way, we can't assume that designers are always the exclusive "owners" of design tokens, or that there is a one-way flow of information from design tool to code. Tools like ZeroHeight, Supernova, Interplay and others already let you create and edit tokens, which can then be synced back to Figma as well as exported to code. Other tools to visualise, organise and manipulate tokens are emerging too, such as Tokens Studio's Flow tool. My hope is that the DTCG format will one day allow teams to pick and mix any combination of such tools and construct whatever design token flows they want.

The "source of truth" for a team's design tokens therefore doesn't have to be Figma. It could be a DTCG file in a git repo (which might be edited by hand or via some dedicated token editor tool), or some kind of token management tool that can import and export DTCG files. In that case, that is where a team can capture their absolute vs relative design intents, as long as the format retains the ability to express both.

When implementing designs, a developer would hopefully use the token names used in the design as their guide and reference the respective variables in their code, which would be relative or absolute as needed. Long story short, I'm very much in favour of keeping the relative units in the spec.
I wasn't 100% opinionated when I created this issue, but as the thread has evolved, I must admit I feel strongly against the mixing and matching of units in the spec. As a programmer, I believe in the value of separation of responsibility. The W3C specification for tokens outputs key/values in a normalized format that we can reliably read. The transform tool (Theo/Style Dictionary) is responsible for translating the output to platform-specific code. Mixing those two concepts only adds to a list of if/else statements I'd much rather avoid. Furthermore, if the W3C spec does not include the base size, then I'm left guessing what 1rem means (is it 16px, or something else?). I hope I've made my points clear, and I thank everyone for participating. A few points that may be of help:
Which gets to the crux of the issue. A rem is an expression of another variable: on the web, it's the HTML root font-size, which could be set to anything. Most use 16px, but some advocate for '62.5%' to make the units simpler in code. Therefore, REM is not a size in and of itself. Rather, it's closer to say an Android sp is 1/72", much like an iOS px unit; both inherit the concept of a point size from print.
If all is px, we know how to convert. Easy peasy.
Except that Figma IS the source of truth. With plugins, we export from Figma (or Sketch, or whatever), and we publish to repos of any sort.
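The "if all is px, we know how to convert" claim above can be shown directly: conversion from a plain px number is mechanical, while converting from rem requires an extra input (the base) that the spec does not currently carry. The function and the 16px default are illustrative assumptions, not part of any spec.

```typescript
// From px, every target is a pure function of the value
// (treating px and pt as 1/72" equivalents, per the discussion above).
function convert(px: number, target: "rem" | "pt" | "px", basePx = 16): string {
  switch (target) {
    case "rem": return `${px / basePx}rem`; // needs basePx, an extra input
    case "pt":  return `${px}pt`;
    case "px":  return `${px}px`;
  }
}

convert(24, "rem"); // "1.5rem" under the assumed 16px base
convert(17, "pt");  // "17pt"
```

Note that only the rem branch depends on a value outside the token itself, which is the ambiguity the comment above objects to.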
I have to disagree on this. It's a two-way street: especially if you are using Tokens Studio with a GitHub setup, both designers and developers can commit changes, as long as they are on the DS team and have proper permissions.
Tokens Studio is a plugin that exports tokens from Figma, but there are others. Personally speaking, Tokens Studio is a slow, buggy, and unintuitive solution, doing too much at once. However, love or hate the Tokens Studio plugin, it seems clear the mission of the W3C Design Tokens spec is to be agnostic to any one technology or plugin, and certainly unopinionated on process and workflow. Still, as design system professionals, we must depend on design truth generated from the design library files as defined by brand designers (published to N platforms). Otherwise, we have a much bigger problem, which is likely outside the scope of this initiative. In other words, the solution must be linear for scalability, not bi-directional.
🤔 I agree with @caoimghgin that this is out of scope, at least for this issue. A specification for design tokens (which is part of something bigger, i.e. a design system) should not dictate where the source of truth lies.
Came here to open the same discussion, and I think this is an important issue to solve; the current examples are misleading for people approaching design tokens. We know design tokens have core principles, and one of them is "platform agnostic". Most examples in the spec use CSS-ready raw values, which is misleading since it doesn't communicate the concept of "transforming the token for different platforms". Most tools use absolute values (px) to do design, and that unit should be used for the raw values of tokens, then converted by tools given the additional info.
This discussion on the use of design tokens, their units, and the surrounding tooling has been incredibly insightful; here are my thoughts.
My proposed system: I've suggested setting the root font size to a fixed, documented value. This approach offers a pragmatic solution for web development, since converting between px and rem then becomes simple arithmetic.
Recently, Figma has introduced local Variables, its own take on design tokens. This addition can be seen as a game changer, as it brings the management of design tokens directly into one of the most popular design tools. It simplifies the workflow, reducing the need for external tools or plugins for creating and managing design tokens. However, one must consider whether this creates a new silo, potentially leading to inconsistencies if other tools or platforms are involved in the workflow. With that in mind, it is crucial to establish a clear "source of truth" and consider how this new feature integrates with existing design and development processes.

In conclusion, my system presents an optimized route for web applications. The broader challenge is finding a middle ground: should design tokens be platform-specific or universally neutral? While my proposal is tailored for web contexts, the dilemma remains: should we adopt a universally accepted standard for all platforms, or place conversion responsibilities squarely on the tools?
What do we gain by having to explicitly include the unit, if the goal is to be platform/tool agnostic?
Please also consider variable units: https://drafts.csswg.org/css-variables-2/#variable-units. Maybe my org wants to use a custom unit of its own. As it stands, requiring "px" or "rem" prevents me from being able to use maths over those values.
Preventing me from doing that might be the feature, not a bug, but without understanding why I need to be explicit with a value, it's frustrating.
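To illustrate the custom-unit idea above: if the format allowed unitless multiples of an org-defined unit, a web build could emit calc() over a CSS custom property instead of a hard-coded px/rem value. The `--u` property name and the function are hypothetical, invented for this sketch.

```typescript
// Emit a CSS calc() expression over a custom-property "unit",
// deferring the actual size to whatever the page sets --u to.
function toCalc(multiple: number, unitVar: string = "--u"): string {
  return `calc(var(${unitVar}) * ${multiple})`;
}

toCalc(4); // "calc(var(--u) * 4)"
```

A fixed px/rem unit in the token value forecloses this kind of composition, which seems to be the commenter's complaint.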
That's why tokens need to be transformed based on the destination platforms supported. Any kind of unit should be banned, because units are always platform-specific and outside the scope of the spec. And sadly, I would never follow a spec that imposes token units I don't need. If I don't build for the web, I don't care about rem, xh, ch, etc...
To follow up on previous posts, I believe |
Sorry for pinging an old thread, but I've just put up a proposal where I'd suggest we NOT make this change: #244. The reasons are outlined there, but TL;DR, after reading this thread:
This is NOT an official decision yet! This is only a way to source feedback, and push this forward. Any/all feedback would be welcome. Thanks all for discussing this change 🙏
I've noticed font size values are spec'd as REM in the W3C draft. Design token values should be platform-agnostic. Native apps such as Android and iOS have no such concept, and a declaration in REM must point to another base value for rem to be meaningful.