
Monday, September 13, 2021

So Where DID the LSB/USB Convention Come From?

-- Bottom line: I still don't know why ham radio adopted the convention of LSB below 10 MHz and USB above 10 MHz. There are several theories, but so far there is no convincing explanation in favor of any one of them. And almost all of the people involved are probably Silent Keys by now, which makes it more difficult to gather first-hand information.

-- I'm not even sure when the convention began to be observed in ham radio. Many of the early SSB books and articles make no mention of it. We don't see it in early ARRL Handbooks. The first mention of it that I found was on page 8 of the 1965 edition of the ARRL's "Single Sideband for the Radio Amateur." That book claims that adding a provision for selectable sidebands would "add appreciably to the cost of the equipment." It went on to say, "For this and other reasons there has been a species of standardization on the particular sideband used in the various amateur bands. Nearly all operations in the 3.5 and 7 Mc. phone sub-allocation is on lower sideband, while the upper sideband is used on 14, 21, and 28 Mc."

-- We know that the informal convention was being followed as early as 1958. Jim N2EY reports that the 1958 manual for the Central Electronics 20A describes LSB as the "sideband most commonly used" on 75, with USB preferred on 20.

-- Some cite a 1959 ITU recommendation on commercial multiplexed radiotelephony as the reason for the convention. But I don't think this obscure and long-ago ruling explains it. If it did, we'd see follow-up FCC regulation, and at least some discussion of the ITU recommendation in the amateur radio literature. But we see none of this. And, as noted above, by 1958 hams were ALREADY -- on their own -- opting to use LSB on 75 and USB on 20. The 1965 ARRL SSB book refers not to some hard-and-fast rule, but rather to "a species of standardization" on LSB and USB. That ARRL book said nothing of the 1959 ITU recommendation.

-- There is a widely held belief that this practice originated in the design of a rig that had a 5.2 MHz VFO and a 9 MHz filter. According to this theory, such a rig -- due to sideband inversion -- would produce LSB on 75 meters and USB on 20. But, as we have demonstrated, this doesn't work, so this theory has to be discounted.
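
To make the sideband-inversion argument concrete, here is a minimal sketch in Python (the mix() helper and the example frequencies are illustrative only, not from any period design). The rule of thumb: the sideband flips only when the mixer output is the oscillator frequency minus the signal frequency, and with a 9 MHz generator and a 5.2 MHz VFO neither band uses that product.

```python
# Sideband inversion rule of thumb, frequencies in kHz: the sideband flips only
# when the output is LO minus signal (the SSB signal is subtracted FROM the LO).
def mix(signal_khz, lo_khz, sideband, use_difference):
    """Return (output_khz, sideband) for the sum or difference mixing product."""
    if not use_difference:                      # sum product: no inversion
        return signal_khz + lo_khz, sideband
    if lo_khz > signal_khz:                     # LO - signal: sideband inverts
        return lo_khz - signal_khz, ("LSB" if sideband == "USB" else "USB")
    return signal_khz - lo_khz, sideband        # signal - LO: no inversion

# The rig in the theory: SSB generated at 9000 kHz (say USB), 5200 kHz VFO.
print(mix(9000, 5200, "USB", use_difference=True))   # (3800, 'USB')  -> 75 meters
print(mix(9000, 5200, "USB", use_difference=False))  # (14200, 'USB') -> 20 meters
# Same sideband on both bands -- so this rig cannot produce LSB on 75 and USB on 20.
```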

-- Early SSB activity seems to have been concentrated on 75 meters, and there was competition for space with AM stations. SSB operators appear to have used the very upper band edge as their gathering spot. Using LSB allowed them to operate very close to the upper band edge -- a lot closer than AM stations could go. This may explain why LSB became the preferred SSB mode on 75. But how do we explain USB on 20 and above? That remains a mystery.

-- It is important to remember that in the early days of SSB, for most hams there were only two important phone bands: 75 meters and 20 meters. 40 meters was CW-only until 1952, and even after that it was crowded with shortwave broadcast stations. So a design that allowed for both 75 and 20 was twice as good as a monoband design.

-- Early on there were designs and parts for phasing rigs. You could take that ARC-5 VFO at 5 MHz, build a phasing generator around it, and then mix it with a 9 MHz oscillator to get on either band. And with just a simple switch, this kind of rig could operate on USB or LSB on either band. So the early popularity of this kind of rig does not explain the convention.

-- There were a lot of surplus 5 MHz ARC-5 VFOs available. There were also FT-243 and FT-241 surplus crystals at both 5 MHz and 9 MHz that could be made into filters.  Later in the 1950s, 9 MHz commercial crystal filters became available.  If you used a 9 MHz filter with a 5 MHz VFO, there would be no sideband inversion in your rig.  If the SSB generator was putting out LSB on 9 MHz, you'd be on LSB on both bands.  So if there was a desire to have LSB on 75, why not just also have LSB on 20? 

-- But if you built a rig with a 5.2 MHz filter and a 9 MHz VFO, you could have LSB on 75 and USB on 20 without having to shift the carrier oscillator frequency. This would save you the trouble and expense of moving the carrier oscillator/BFO to the other side of the passband. This desire to economize and simplify may explain why we ended up with LSB on 75 and USB on 20. But it still raises the question: why the desire for USB on 20?

-- Both the manufacturers and the hams wanted sideband standardization. With monoband rigs, the manufacturers could cut costs by building for only one sideband. Hams also wanted to cut costs, and they did not want to have to figure out which sideband a station was using when trying to tune it in.

-- By 1962-1963 Swan and Heathkit were selling monoband SSB transceivers that used the "conventional" sidebands: the rigs for 75 and 40 meters were on LSB, while the 20 meter rigs were on USB. There were no provisions for switching to the other sideband. This seems to have reinforced the practice of observing the convention. (Heath later added sideband switching to the HW monobanders -- in view of the growing observance of the convention, they may have been better off sticking with their original design. Does anyone know why they did this?) But again, why USB on 20 and above?

-- In 1963, Swan, by then in Oceanside, California, came out with the Swan 240. Swan used a filter centered at 5174.5 kc. The VFO ran from 8953 kc to 9193 kc on 75 and 20, and from 12222 kc to 12493 kc on 40. This gave the buyer 75 and 40 on LSB and 20 on USB with only one carrier oscillator frequency. (Swan offered a mod that allowed hams to install an additional, switchable carrier oscillator frequency. I luckily acquired one such modified rig.) So there is an explanation for LSB on 75, but again: why USB on 20 and above?
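
Working those published numbers through -- and assuming the 5174.5 kc SSB generator put out USB, an inference from the result rather than anything in the Swan documentation -- the frequency plan falls out like this (a quick sketch, in kHz, taking the VFO at the low end of each range):

```python
IF = 5174.5           # published Swan 240 filter center frequency, kHz
print(8953.0 - IF)    # 3778.5  -> 75 meters: VFO minus IF, sideband inverts -> LSB
print(12222.0 - IF)   # 7047.5  -> 40 meters: VFO minus IF, sideband inverts -> LSB
print(8953.0 + IF)    # 14127.5 -> 20 meters: VFO plus IF, no inversion -> USB
```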

This is an important part of ham radio history.   There should be a clear answer.  We need to find it.   If anyone has any good info on this, please let me know.  

12 comments:

  1. From my reading, the widespread development of SSB has alternated between filter and phasing methods, something like this: 1. Filter method (with LC filters at audio), 2. Filter method with crystal filters at a low IF (e.g. 450 kHz), 3. Phasing method by amateurs who couldn't afford crystal filters, 4. Filter method as more of us could afford crystal filters or could buy them pre-made, 5. Renewed interest in phasing and the "third method" (Weaver), then SDR techniques.

    Phasing designs could do both sidebands cheaply. So the story of LSB vs USB starts off at stage 1 or stage 2 above. The 9 MHz exciter/5 MHz VFO combination is very much a 'Johnny-come-lately' technique, popular when SSB was being adopted en masse and operating habits were already entrenched.

    This July 1949 article describes a configuration that derives SSB from audio, with several stages to mix up to 13 kHz, then 440 kHz, then 7 MHz. https://www.armag.vk6uu.id.au/1949-july-AR.html That article is derived from QST Jan 1948. It's easier to generate USB than LSB at a near-audio IF, as the inductance requirements are lower. The signal is still USB at 440 kHz, but a 7.6 MHz local oscillator is used, so the result is LSB on 7 MHz (see the sketch at the end of this comment). It's not explained why they subtracted rather than using addition with a VFO at 6.7 MHz. But if you are using a transmitter with a low-pass filter, your output will be cleaner if you subtract. Receiver birdies may also be fewer if you use the subtraction approach in reverse on receive (harmonics of the VFO might sweep by faster).

    More in Part 2.
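
    Coming back to that 7.6 MHz oscillator: a small sketch of why the final subtraction flips the sideband, following two audio tones through the last mix (the specific tone frequencies are just illustrative):

    ```python
    # Follow two audio tones through the final mix (all in kHz).  Assumed for
    # illustration: suppressed carrier at 440 kHz, USB at the IF, 7600 kHz LO.
    carrier_if = 440.0
    lo = 7600.0
    for audio in (0.3, 2.5):                 # two audio tones, kHz
        at_if = carrier_if + audio           # USB: higher audio sits higher at the IF
        at_rf = lo - at_if                   # LO minus signal
        print(f"{audio:.1f} kHz audio -> {at_if:.1f} kHz IF -> {at_rf:.1f} kHz RF")
    # 0.3 kHz audio -> 440.3 kHz IF -> 7159.7 kHz RF
    # 2.5 kHz audio -> 442.5 kHz IF -> 7157.5 kHz RF
    # The higher audio tone lands LOWER in frequency, so the subtraction has turned
    # the 440 kHz USB signal into LSB around a 7160 kHz suppressed carrier.
    ```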

  2. The 1961 RSGB Handbook has a lot on SSB. It starts off with phasing SSB as it was developed by amateurs. The filter method was described as the 'classic' approach developed earlier. But it was not the same filter method that we know today.

    Page 306 describes the 'low frequency' method of generating SSB with a block diagram. It's similar to the July '49 article mentioned before, but the frequencies are a little different. It uses 10 kHz as the first oscillator, with the upper sideband at 10 - 13 kHz selected by the filter. That goes to a second balanced modulator with its oscillator at 490 kHz. The sum product is again selected by filtering, giving a 500 kHz USB signal. Feeding that into a balanced modulator along with a VFO gives a 3750 kHz USB signal (and an LSB signal at 2750 kHz). Only the USB signal is desired, so it is selected by the output filter and fed to the final amplifier, etc.

    This is quite complicated, with all those balanced modulators/mixers. The next step was to get rid of the low-frequency stages and do the SSB generation at around 450 kHz, with filters made from war surplus crystals.

    The simple filters used 2 crystals with a symmetrical bandpass. You could have the carrier crystal either above or below the bandpass, and so get USB or LSB output.

    This was in the valve days and local oscillators were quite high power. It was apparently difficult to suppress the carrier fully in just the balanced modulator stage alone. So you needed the crystal filter to help.

    Another type of crystal filter used 3 crystals with a steeper bandpass on the lower side. This is shown in the book. Provided you had the carrier crystal below the bandpass you could get more carrier attenuation as it was further down the steep side. This would generate a USB signal.

    Page 309 uses this approach to describe a filter type exciter that worked on 3.5, 14, 21 & 28 MHz (no 7 MHz). You would first get it going on 3.5 MHz. A 4.2 MHz VFO was specified, giving an LSB output. The other mixer product at about 4.7 MHz would be partly suppressed by the low pass filter in the final amplifier.

    What about the higher HF bands? 7 MHz was only 100 kHz wide in the UK and very crowded so was not considered in this design. A heterodyne upconverter was used for 14, 21 & 28 MHz (the book coming out not long after a solar high).

    Again there was subtraction. The 3.5 MHz LSB signal was subtracted from the heterodyne frequencies, inverting the sideband and giving USB outputs on the higher bands. It is possible that crystals much above 10 MHz were not commonly available then (especially as war surplus). To obtain the frequencies needed without unwanted spurii, lower-frequency crystals were operated in 3rd-overtone mode. Hence a 6 MHz crystal in the oscillator circuit would give 18 MHz, and when the 4 MHz LSB was subtracted from that you'd get 14 MHz USB. Similarly for 21 & 28 MHz, with 8.3 and 10.9 MHz crystals respectively (the arithmetic is sketched at the end of this comment). Again there's the advantage of the unwanted mixing product being higher than the desired frequency, so it could be suppressed in a low-pass filter. Frequencies commonly used during the war might also have had a bearing on the choice of a mixing scheme that used available surplus crystals.

    To sum up, this is possibly a story of path dependence. It is slightly easier to generate a good USB signal than an LSB signal, and that has been the case whether the generation was done at audio or at 450 kHz. If you heterodyned up to an amateur band with the VFO above the signal, you inverted the sideband, as happened for 3.5 MHz. If you then heterodyned up again (since going from 450 kHz to 14 MHz in one go is too big a step given the likely images), the mixing scheme chosen might have inverted it again, which brought us back to USB.

    My theory anyway.
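
    As a back-of-envelope check of the overtone/heterodyne figures above (the 3.9 MHz exciter frequency and the times-three overtone step are my assumptions, not figures from the Handbook):

    ```python
    # Heterodyne upconverter arithmetic, all in MHz.
    exciter = 3.9                         # LSB exciter output, roughly mid phone band
    for crystal in (6.0, 8.3, 10.9):      # crystals run on their 3rd overtone
        het = 3 * crystal                 # 18.0, 24.9, 32.7 MHz heterodyne oscillator
        out = het - exciter               # oscillator minus signal: sideband inverts
        print(f"{crystal} MHz crystal -> {het:.1f} MHz LO -> {out:.1f} MHz out, USB")
    # 6.0 MHz crystal -> 18.0 MHz LO -> 14.1 MHz out, USB
    # 8.3 MHz crystal -> 24.9 MHz LO -> 21.0 MHz out, USB
    # 10.9 MHz crystal -> 32.7 MHz LO -> 28.8 MHz out, USB
    ```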

  3. Thanks Peter. But I just don't think there were many rigs like the complicated low-frequency filter jobs you describe. Can you point to any commercial rig or homebrew design that could have set this convention in motion?

    Also, there were lots of rigs that were putting out LSB on 75 meters. As you point out, if it was a phasing rig you could have had LSB or USB on 75 or 20 (so that can't be the origin of the convention). If you had a filter rig with a 5 MHz VFO and a 9 MHz filter, unless you switched the carrier oscillator to the other side of the filter, you'd have the same sideband on both 75 and 20 (no convention there). And if you had a Mythbuster-like rig with the filter at 5.2 and the VFO at 9, you would get LSB on 75 and USB on 20. That is how the Swan 240 worked. But this raises the question: why did they want LSB on 75 but USB on 20? What is your theory? 73 Bill

  4. A very interesting subject Bill and I think it's going to run for a long time.
    Anyway anyone on 17m?
    DE HS0ZLQ /G0MIH.

  5. Thanks Bill. There are two other things to consider that might lead one to answers:

    1. Crystal filters. Often these just had one crystal and were intended for sharp CW reception. Their passband is very asymmetrical, with peaks and notches a kHz or so away from each other. This 1950 constructional article explains that, with filter techniques then still undeveloped, you could only get one sideband to work. https://www.armag.vk6uu.id.au/1950-june-AR.html

    2. The 1961 General Electric Sideband Book has a chart showing the perils of various mixing schemes. This first appeared in Nov-Dec 1956. It assumed you generated the SSB at 450 kHz before mixing it up to an amateur band at either 1.8 or 3.8 MHz (the latter scheme being the most common for 75m). It then takes 3.8 - 4 MHz as a base from which you'd mix to get to the higher HF bands. In my skim I didn't notice any mention of sideband inversion, as the article's main purpose was to demonstrate which mixing combinations were good and which were prone to spurii problems. But there was this line:

    "These two examples both illustrate the desirable feature of having both mixer input signals higher in frequency than the output signal."

    The reason this is desirable is that harmonics of the input signals are out of the way and less likely to cause spurious products. Balanced mixers were also less common then, so even more unwanted products appeared at mixer outputs, making wise input-frequency selection critical.

    However, subtraction also has trade-offs. For example, if you wanted 4 MHz by using a 12 MHz VFO with an 8 MHz SSB generator, the 12 MHz VFO might be less stable than a VFO at, say, 5 MHz. But if you could get the VFO acceptably stable at all frequencies below 15 MHz, you could band-switch it to cover 3.5, 7, 14 & 21 MHz -- and 28 MHz if you tolerated a bit more drift with the VFO at 20 MHz. The SSB at 8 MHz would just need to be USB, as its subtraction for 3.5 and 7 MHz would automatically give an LSB output (see the sketch at the end of this comment).

    Another scheme is to have just one of the two input frequencies presented to the mixer higher than the output. It might not be perfect, but there are fewer combinations of troublesome harmonics than if you were simply adding frequencies. That's what happens in the example mentioned earlier, where 3.8 MHz LSB was generated from a 500 kHz USB signal by subtracting it from a 4.3 MHz VFO. In this case the VFO is still at a sufficiently low frequency to be stable.

    Higher HF bands were covered by another converter. Again you might use frequency subtraction to get on 14 MHz, like the RSGB design (which included a circuit) did. If you did that you would have inverted the sideband, changing LSB to USB. Interestingly the 3.5 to 14 MHz scheme in the GE book only has an addition option with a 10.3 MHz crystal. This would not have flipped the sideband.

    More drilling down would be desirable to get to the bottom of this. Reading the articles, it seems like the 'professional amateurs' always used filters (first audio, then crystal) to generate SSB, whereas the 'amateur amateurs' who couldn't afford multiple crystals on precision frequencies went for phasing. That was cheaper to build, harder to understand, and sometimes gave poor opposite-sideband rejection given the sloppy component tolerances and drifty parts. The 'professional amateurs' were likely in this game first and set the standards, including the choice of sidebands, for reasons we haven't yet fully established.

    Other sources of material may include Pat Hawker's column in Radcom (where he might have discussed this). His first column appeared in 1958, so that might have been a bit late, as the matter seems to have been settled by then (there's a 1958 article in NZ's Break In that also refers to LSB on 3.5 MHz and USB on 14 MHz). But he might have discussed it since. Also, there was a move (I think in the 1970s or 80s) for the IARU to adopt a recommendation of a switch to USB on amateur frequencies below 10 MHz. I think it was approved but never implemented by member societies.
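
    A sketch of how such a band-switched plan around an 8 MHz USB generator could fall out (the specific VFO frequencies are my own illustrative picks, not a documented design):

    ```python
    # All in MHz.  "diff" means the output filter selects VFO minus generator
    # (sideband inverts); "sum" means it selects VFO plus generator (no inversion).
    GEN = 8.0                             # SSB generated here as USB
    plan = [("80 m", 11.8, "diff"), ("40 m", 15.1, "diff"),
            ("20 m",  6.2, "sum"),  ("15 m", 13.2, "sum"), ("10 m", 20.4, "sum")]
    for band, vfo, product in plan:
        if product == "diff":
            out, sideband = vfo - GEN, "LSB"
        else:
            out, sideband = vfo + GEN, "USB"
        print(f"{band}: {out:.1f} MHz {sideband}")
    # 80 m: 3.8 MHz LSB,  40 m: 7.1 MHz LSB
    # 20 m: 14.2 MHz USB, 15 m: 21.2 MHz USB, 10 m: 28.4 MHz USB
    # The subtraction needed to reach 80 and 40 m hands you LSB "for free", while
    # the higher bands come out on the USB the generator started with.
    ```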

  6. Also of interest could be this, which refers to a 1959 ITU rule (249). However it's not clear why this would be relevant to plain amateur USB / LSB operating. https://ham.stackexchange.com/questions/1336/why-do-we-use-lsb-below-10-mhz-and-usb-above-10-mhz-when-operating-ssb-hf

    More on 249 here (not much more detail) https://search.itu.int/history/HistoryDigitalCollectionDocLibrary/4.275.43.en.1004.pdf

    A reply from W5DXP in the above thread talks about people at Texas A&M University using 5 MHz war surplus crystals and a VFO in the 8-9 MHz region in 1955. This would have covered both 80 & 20m, with the sideband inverted on 80m. The Texas A&M Radio Club W5AC is older than even the ARRL and continues today. http://www.thebatt.com/life-arts/w5ac-continues-a-m-s-amateur-radio-tradition/article_975af076-9da8-11eb-af61-5b4b558e0651.html

    This might be the best lead thus far.

  7. A stack of early SSB articles can be found here. http://www.one-electron.com/Archives/Radio/RadioSSB/

    There was a very active group of SSBers around 1948. Note this was a sunspot peak on a very good cycle. Australia's Amateur Radio magazine had items at the time on how to receive SSBSC, confirmation that it was legal, and articles explaining both filter and phasing methods.

    The Jan 1948 QST article by Arthur Nichols W0TQK describes an early filter SSB rig. It generated a DSB signal around 9 kHz. The upper sideband of this was filtered and converted up to about 550 kHz (this would have allowed troubleshooting on an AM radio). That was then mixed with a VFO at 13.6 MHz. All mixing was by addition, so there was no subtraction and no sideband inversion. The result was a USB signal at 14.2 MHz.

    Much discussion on this QRZ thread. https://forums.qrz.com/index.php?threads/an-urban-legend-disproved.462916/

  8. Thanks Peter -- It is good to know that I'm not the only one trying to figure this out. A few comments on your observations:
    -- I don't think that obscure ITU finding is the source of the convention.
    -- It would be nice to think that a rig like my "Mythbuster" (5.2 MHz filter and 9 MHz VFO) is the source of the convention. After all, it DOES produce LSB on 75 and USB on 20 with no need for a second carrier oscillator crystal. But this raises the question: WHY did hams want USB on 20 and above? It seems clear why they wanted LSB on 75: so they could hang out at the top of the phone band. But why not use LSB for the same reason on 20? Also, it would have been EASIER to build a 9 and 5 rig that would NOT invert one of the sidebands: just use a 5.2 MHz VFO (there were LOTS of WWII ARC-5 Command Sets available) and use it with a 9 MHz filter (you could make one from war surplus crystals, and by the late 50's commercial filters were available). With this kind of rig you could have been on LSB on 75 and 20, and you would not have had to fuss with stabilizing a VFO at 9 MHz. So the real question is: why did they want to be on USB on 20 meters? 73 Bill

  9. Thanks Bill.

    Agree that ITU thing was not really relevant.

    Bear in mind that ladder crystal filters (with all the crystals on the one frequency, which would have permitted easy SSB) were not yet invented. Thus crystals in lattice filters needed to be (say) 1 or 2 kHz apart. Then there would have to have been a crystal on another frequency for the carrier oscillator. Amateurs could grind crystals by hand, but this would be difficult if you didn't have exact frequency measurement equipment.

    From some of the material mentioned, there were a lot of war surplus crystals in the 450 - 500 kHz region and also in the 5 MHz region. Not only that, but disposals crystals came in closely spaced channels. So if you were lucky with the channels available at your disposals outlet, you wouldn't have to grind crystals. Mixing and converting were well established, so the real technical challenge for getting on SSB was its generation. There were so many 'unknowns' that anything that reduced them, such as using surplus crystals on known frequencies, would have been a godsend. What crystals you could get would have been the first thing that governed your mixing scheme, and apparently a lot of 5 MHz types were available at closely spaced frequencies ideal for crystal filters. Also note that crystal filters had somewhat asymmetrical responses, so it might have been easier to generate USB rather than LSB from them. And some of the really early generators that did their sideband generation around 10 kHz were USB.

    Phasing SSB was known but was slow to be embraced by amateurs due to (a) the complex maths and (b) the difficulty of getting exact values and stable parts for the audio phase-shift network. And it was considered sort of a poor man's technique for people who didn't have crystal filters. However, a phasing generator could be put onto any lower HF frequency and do either USB or LSB with just one crystal being required. So if you had a 9 MHz crystal, you could generate the SSB there and have your 5 - 5.5 MHz VFO for the two popular SSB bands. Some references mention that generating the SSB at 9 MHz was pioneered by the phasing SSB guys (who were really interested in economy/simplicity) but later adopted by filter SSB builders and manufacturers, especially when pre-made crystal filters with carrier crystals became available (although they cost a fortune for a long time).

    I think it's quite plausible that a mixing scheme like that in your Mythbuster was the source, given that such a scheme was used early on. There was the Texas A&M example from 1955, but I think one of the references had this scheme being used even earlier.

  10. Peter: I'm looking at Don Stoner's 1958 "New Sideband Handbook." He says that 9 Mc war surplus crystals were plentiful and easily fashioned into filters. He also says that in 1958 there were at least 3 commercial manufacturers of 9 Mc filters. Stoner even presents a schematic of a 9 Mc filter rig for 75 and 20. But -- and this is key -- he has two switched carrier oscillator crystals: one for USB and the other for LSB. So OM Stoner is not engaged in myth making!

    I'd like to think that a rig like the Mythbuster explains the convention, but I don't think it does. I don't think homebrewers were building rigs like mine. And we don't really see a commercial version until 1963 with the Swan 240.

    So I'm afraid it remains a mystery. Where DID the convention come from? Inquiring hams want to know!

    73 Bill

  11. Been following Bill's question/comments and will admit I spent several enjoyable evenings last week reviewing my paper library and searching the internet for clues.

    Bottom Line Up Front (BLUF): I did not find the convention's source. I'm not sure I am adding any new content to the comments already provided.

    - As discussed in the comments, the need to easily filter out unwanted mixing products drives design frequencies higher, while the need for stability drives them lower. It seems VFO and converter frequencies in the 2-10 MHz range were the best compromise in the decade following WW2, when amateur SSB matured. These frequencies and the ham band allocations may have set up an environment for sideband inversion on the lower bands.

    - I did see several designs that mixed a 450 kHz SSB signal with a VFO and then a fixed-frequency converter oscillator. No issue here, other than to note that you just need to do the math for sideband inversion. I did have an "oh yeah" moment when I read an article on mixing and realized that sideband inversion also occurs when AM signals are converted; it just doesn't matter, since both sidebands are identical -- right?

    Observations:

    - The 1968 RSGB handbook contains the following sentence (pg. 10.26): "It has become an accepted convention adhered to by all s.s.b. stations, to transmit low sideband below 10 Mc/s and high sideband above 10Mc/s as a result of a CCIR (International Radio Consultative Committee) recommendation." Bill, you cited the ITU theory, but the Handbook provides no further substantiation so discounting this one seems appropriate.

    - Several 1950s designs included an option for (or at least a discussion of) selectable sidebands, for both the filter and phasing SSB systems.

    - Found "Single Sideband Techniques", by Jack N. Brown W3SHY dated 1954 online which is a compilation of articles in CQ magazine. Sideband inversion is explicitly discussed. Also interesting is the 1954 Collins Radio advert for their mechanical 455C-31 (455 kc & 3.1 kc bandwidth) filter for US$35. Fast forward to 2021 and in today's dollars = US$356; an expensive part. No wonder phasing was the amateur amateur design choice. https://archive.org/details/SingleSidebandTechniques/mode/2up

    - Found the Collins Radio "Fundamentals of Single Side Band" here: http://collinsradio.org/archives/ssb_fundamentals/Fundamentals%20of%20Single%20Side%20Band-one-file.pdf All of the examples and diagrams use a USB signal, but the book acknowledges LSB and has options for it. Military and maritime operation is USB, so that might explain the USB examples.

    - So a final question: do you think many of our early SSB forefathers knew about sideband inversion (or were concerned about which sideband they were actually transmitting)? It seems easy enough to just tune the signal in at the receiving end to make it intelligible. The convention could just be a result of the common mixing schemes at the time of widespread SSB adoption (as previously discussed in the comments above).

    Regards.

    John Flint
    KAØLDB

    Replies
    1. Hello John.
      To answer your question, yes, of course the early SSB builders knew all about sideband inversion. In 1960 Joe Galeski (not an engineer -- an optometrist!) discussed it in his QST article on his famous IMP transmitter:
      https://soldersmoke.blogspot.com/2021/08/joe-galeskis-1960-imp-3-tube-filter-ssb.html

      I think the myth cropped up later because many hams just misunderstood (and continue to misunderstand) how sidebands are produced in an SSB transmitter, especially the circumstances in which one band will end up on LSB and the other on USB. 73 Bill

