Given that applications might want to be configurable to select minimum and/or maximum TLS versions, for security or interoperability reasons, it seems reasonable to be able to take an input string of some sort and turn it into a `TLS.Version`. But sadly, that is a bit tricky without hardcoding an explicit table in the application, since we only have `Show` and `Bounded` instances, but not `Enum` or `Read`:
```haskell
-- | Versions known to TLS
--
-- SSL2 is just defined, but this version is not and will not be supported.
data Version = SSL2 | SSL3 | TLS10 | TLS11 | TLS12 | TLS13 deriving (Show, Eq, Ord, Bounded)
```
Because the `Enum` instance is missing, it is not possible to write `[TLS10 .. TLS13]` or map `show` over all the values to construct a map of `String -> Version`. And without `Read`, the library does not provide that either. As a last resort I could have tried `verOfNum` and `numericalVer`, but those are internal, non-exported functions.
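To make the gap concrete, here is a minimal sketch of the workaround an application has to use today, with the version list hardcoded, and (in the comment) the one-liner that a derived `Enum` instance would allow instead:

```haskell
import qualified Data.Map.Strict as Map
import Network.TLS (Version (..))

-- What applications must hardcode today: an explicit name-to-version table
-- built from the derived Show names.
versionNames :: Map.Map String Version
versionNames = Map.fromList [ (show v, v) | v <- [SSL3, TLS10, TLS11, TLS12, TLS13] ]

-- With a derived Enum (Bounded already exists) the list could instead be
-- enumerated, so new versions would be picked up automatically:
--   versionNames = Map.fromList [ (show v, v) | v <- [minBound .. maxBound] ]
```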
So I think that something needs to change to make it possible to read in a TLS version from a configuration file or command-line switch, without hard-coding a list of known version names into applications.
Therefore, I think what's needed is:
1. A more standard output format for the names produced by the `Show` instance of `TLS.Version` (so explicit, rather than derived).
2. A `Read` instance for `TLS.Version` that matches the `Show` name forms (see the sketch after the example below).
3. A function that returns the subset of the supported protocol versions within an optional ceiling and floor, sorted in descending order. For example:
```haskell
{-# LANGUAGE MultiWayIf #-}

import Network.TLS as TLS (Version (..))

-- | The supported protocol versions within an optional ceiling and floor,
-- in descending order of preference.
tlsVersionRange :: Maybe TLS.Version -> Maybe TLS.Version -> [TLS.Version]
tlsVersionRange vmax vmin =
    filter byVersion [TLS13, TLS12, TLS11, TLS10, SSL3]
  where
    byVersion ver =
        if | Just v <- vmax, v < ver -> False
           | Just v <- vmin, v > ver -> False
           | otherwise -> True
```
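For item 2, a minimal sketch (not the library's actual code) of a hand-written `Read` instance that accepts exactly the names produced by the current derived `Show` instance, written as it might appear inside the library where the constructors are in scope:

```haskell
import Data.List (isPrefixOf)

-- Accept exactly the strings that the derived Show instance produces
-- (SSL2, SSL3, TLS10, ...); whichever name forms the library settles on,
-- Read should stay in sync with Show.
instance Read Version where
  readsPrec _ input =
    let s = dropWhile (== ' ') input
    in [ (v, drop (length name) s)
       | v <- [SSL2, SSL3, TLS10, TLS11, TLS12, TLS13]
       , let name = show v
       , name `isPrefixOf` s
       ]
```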
Alternatively, the client and server settings should take a min/max pair and compute the range internally. I don't see a compelling use case for protocol version "holes". If the min/max are specified, they would filter the existing protocol list parameter, which would change to list all protocols, with the default ceiling initially `TLS12`, until TLS13 support is sufficiently stable to be enabled by default.
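A sketch of what that filtering could look like, expressed against the existing `supportedVersions` field of the `Supported` record; the function name and placement are illustrative, not an existing API:

```haskell
import Network.TLS (Supported (..), Version (..))

-- Narrow the existing version list to an optional floor/ceiling; under the
-- proposal above, the library would do this internally from a min/max pair.
applyVersionBounds :: Maybe Version -> Maybe Version -> Supported -> Supported
applyVersionBounds vmin vmax sup =
  sup { supportedVersions =
          [ v
          | v <- supportedVersions sup
          , maybe True (<= v) vmin   -- drop versions below the floor
          , maybe True (>= v) vmax   -- drop versions above the ceiling
          ] }
```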
I would like to support the latter approach. Currently, we have command-line options for max versions (such as --tls2). I believe that we should also provide command-line options for min versions, e.g. --min-tls2.
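As an illustration only, with hypothetical flag names (the real tools' options may differ), min/max options could be parsed along these lines with optparse-applicative, using a hardcoded name table until the library offers something better:

```haskell
import Options.Applicative
import Network.TLS (Version (..))

data TlsBounds = TlsBounds
  { boundMin :: Maybe Version
  , boundMax :: Maybe Version
  }

-- Map a command-line string to a Version; hardcoded until the library
-- provides a Read instance or an equivalent lookup.
versionReader :: ReadM Version
versionReader = eitherReader $ \s ->
  case lookup s [ (show v, v) | v <- [SSL3, TLS10, TLS11, TLS12, TLS13] ] of
    Just v  -> Right v
    Nothing -> Left ("unknown TLS version: " ++ s)

tlsBoundsParser :: Parser TlsBounds
tlsBoundsParser = TlsBounds
  <$> optional (option versionReader (long "tls-min" <> metavar "VERSION"))
  <*> optional (option versionReader (long "tls-max" <> metavar "VERSION"))
```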
I agree that min/max versions are a good interface option. FWIW, I convinced the OpenSSL team to adopt those, and did a large part of the implementation. Naturally, I am talking about the library configuration API, not the CLI of any particular command-line tool, but rather the CLIs and configuration files of other applications using the library.
This means that users need to be able to take configuration settings from files and the like and map them to appropriate parameters, ideally without knowing all possible values in advance, that is, by having the library interpret setting strings.
Once again with OpenSSL as an example, there is now a general-purpose interface for applications to point the OpenSSL library at a configuration file "section" (OpenSSL uses configuration files with "[sectionname]" headings, with variables in each section, and in some cases multi-valued parameters defined via references from one section to another) and have a client or server context configure itself based on the configuration file.
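For concreteness, an openssl.cnf fragment along those lines; the section names are conventional but arbitrary, while MinProtocol/MaxProtocol are existing SSL configuration commands:

```
openssl_conf = default_conf

[default_conf]
ssl_conf = ssl_sect

[ssl_sect]
system_default = system_default_sect

[system_default_sect]
MinProtocol = TLSv1.2
MaxProtocol = TLSv1.3
```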
The OpenSSL library also supports having applications pass command-line switches with a selected prefix to OpenSSL, to achieve similar goals. Thus an application can pass all "--ssl-prefix-..." command-line arguments to the SSL library and have it apply them to the SSL context, freeing the application from having to constantly adapt to new SSL features. This is not yet adopted by all applications, as support is comparatively new, but it is the direction in which we (OpenSSL) intend to encourage applications to move.
I've not looked at the "settings" module issue or proposed implementation, so please consider that in the context of the above. These are all related application interface issues.