- ASCII was never intended as an internal processing code, only as an interface standard for data exchange (hence the name: American Standard Code for Information Interchange).
- IBM never switched; its mainframes still use EBCDIC internally and ASCII for communication.
- IBM was a major proponent of ASCII, but not the sole force behind it, and especially not an international one.
- ASCII soared in international use in the 1970s after being recommended by ISO and ECMA - especially the latter being the driving force, due to the huge variability of character codes within Europe.
- (Later) Minicomputers and especially microcomputers simply started out using ASCII for processing as well, as there was no reason to invent a different code (*1).
> I'm trying to figure out when IBM switched to ASCII and when ASCII became a worldwide standard.
Well, IBM never switched.
EBCDIC is used for historic reasons within a /360 mainframe, but all outside connections (except proprietary ones) are ASCII. In fact, since IBM was a driving force behind ASCII, the /360 was prepared to switch to ASCII where it mattered for BCD handling: a mode bit selected ASCII instead of EBCDIC codes for decimal arithmetic. Except, it never became useful and was dropped in the late 1980s.
> Moreover, did IBM make ASCII a worldwide standard?
IBM was a major force behind ASCII from the start, pushing for a standard. This included the use of preliminary versions of the standard in terminal systems. ASCII was also intended to be used for the /360 and any later machine, but standardisation took longer than expected, so IBM had to go forward with its own 8-bit code, EBCDIC, based on prior 6-bit and punched-card codes.
While a main player, IBM alone would not have been able to force it. Similarly, a US government buying order could not do it - after all, it requested only compatibility with ASCII for information exchange, not operation in ASCII. That was a loophole big enough for anything in existence, or yet to be invented, to slip through. All that was needed was some interface to accept ASCII data and reply using it.
ASCII only became an international standard when ECMA, the European Computer Manufacturers Association, recommended an ASCII-based international variant in 1965 (ECMA-6), which became an international recommendation in 1967 as ISO 646 and was finally accepted in 1972. Here ISO 646-IRV defines a compatible base for all participating (Latin) scripts; ASCII at that point is simply the US variant, called ISO-646-US, and corresponds to the 1968 version.

From the early 1970s on, ISO 646 became the major encoding standard used for anything except mainframes.
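To illustrate what "variant" means here, a minimal sketch in Python (the table and helper names are mine, not from any standard library): ISO 646 reserves a handful of code positions for national use, so the same bytes display differently under the US variant and, say, the German one (DIN 66003).

```python
# National-use positions of ISO 646, shown for the German variant
# DIN 66003: code point -> (US/ASCII glyph, German glyph).
DIN_66003 = {
    0x40: ("@", "§"),
    0x5B: ("[", "Ä"),
    0x5C: ("\\", "Ö"),
    0x5D: ("]", "Ü"),
    0x7B: ("{", "ä"),
    0x7C: ("|", "ö"),
    0x7D: ("}", "ü"),
    0x7E: ("~", "ß"),
}

def render(data: bytes, variant: int) -> str:
    """Render 7-bit bytes under one column of the table (0 = US, 1 = DE)."""
    return "".join(DIN_66003.get(b, (chr(b), chr(b)))[variant] for b in data)

wire = b"a[i] = x | y;"          # the very same bytes on the wire...
print("US:", render(wire, 0))    # -> US: a[i] = x | y;
print("DE:", render(wire, 1))    # -> DE: aÄiÜ = x ö y;
```

Same bytes, different glyphs - which is exactly why the IRV base matters: everything outside those national-use slots reads the same everywhere.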
> Is it safe to say that IBM moved to ASCII starting from System/370?
No. Beside the fact that the /370 was more of a renaming game than a really new series, the /360 was able to handle ASCII from the start. In fact, ASCII as a hardware-supported feature got dropped from the line around the time it was renamed /390.
> If so, is it safe to say that IBM started using ASCII in the 1970s?
No. IBM was using ASCII already before it was a standard, and continues to do so today.
> And if so, is it safe to say that the System/370 had many clones, and therefore ASCII became popular worldwide?
No, as /370s still use EBCDIC as their default character set. Unix on the /370 and its successors is an exception. But ASCII can be, and is, used for all communication with external systems (which is what satisfied the buying order mentioned above).
In the IBM mainframe world (*2) two basic codesets were used:
- EBCDIC for everything within the system - that is CPU, memory, disks, tapes and other storage - as well as proprietary links to other mainframes.
- ASCII for all communication with terminals and (non-proprietary) remote systems.
And this continues to this day.
Modern (post-1970) EBCDIC became a full superset of ASCII. EBCDIC's structure reflects ASCII, and it is also the reason why ISO 8859 contains two areas of control characters :)
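The superset property is easy to see in practice - a minimal sketch, assuming Python's built-in cp037 codec (one common EBCDIC code page) as a stand-in for "modern EBCDIC":

```python
# Every printable ASCII character has an EBCDIC counterpart, so text
# survives a round trip through EBCDIC losslessly.
for ch in map(chr, range(0x20, 0x7F)):              # all printable ASCII
    assert ch.encode("cp037").decode("cp037") == ch

# The code points differ, of course: 'A' is 0x41 in ASCII, 0xC1 in EBCDIC.
print(hex("A".encode("ascii")[0]), hex("A".encode("cp037")[0]))  # 0x41 0xc1
```

(The two control areas mentioned above are 0x00-0x1F and 0x80-0x9F in ISO 8859.)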
*1 - Then again, some did, like Commodore (PET) or Apple (Apple II), thinking private codes might be helpful - except those got confined to special areas and hidden beneath an otherwise ASCII-compatible surface.
*2 - That is, IBM and all hardware-compatible systems like Hitachi, Fujitsu, Bull, Univac, RCA, Siemens, ...