@DonaldB has written: @JimKasson has written: @DonaldB has written: …so you’re going to try to record 150 lines/mm, which is basically the same spec as any other 10x objective. My 10x Olympus spec is 1.3 um at 10 mm working distance.
150 lp/mm, not 150 lines/mm.
Is the 1.3 um on the image side or the object side? And what, precisely, is the metric? If it’s PSF diameter and on the image side, how are you going to resolve it with a 5 um sensor?
It’s 1.3 um on the object side. So 13 um on the image side, whatever the metric is
The lens I have is 1500 lp/mm on the object side, for a line width of around 400 nm.
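A rough sketch of that arithmetic, taking the 10x magnification and the 5 um pixel pitch from the question above as given (numbers from the thread, not from any datasheet):

```python
# Object-side vs image-side arithmetic for a lens used at 10x.
# Assumed inputs, all taken from the posts above rather than a spec sheet.
magnification = 10
spot_object_um = 1.3        # quoted 10x objective spec, object side
lens_lpmm_object = 1500     # quoted litho-lens resolution, object side
pixel_pitch_um = 5.0        # sensor pixel pitch mentioned in the question above

spot_image_um = spot_object_um * magnification        # 13 um at the sensor
lens_lpmm_image = lens_lpmm_object / magnification    # 150 lp/mm at the sensor
sensor_nyquist_lpmm = 1000 / (2 * pixel_pitch_um)     # 100 lp/mm for a 5 um pitch

print(f"Image-side spot:      {spot_image_um:.0f} um "
      f"(~{spot_image_um / pixel_pitch_um:.1f} pixels across)")
print(f"Lens at the sensor:   {lens_lpmm_image:.0f} lp/mm")
print(f"Sensor Nyquist limit: {sensor_nyquist_lpmm:.0f} lp/mm")
```

On those numbers the 13 um figure spans well over two pixels, while the 150 lp/mm figure sits above the 100 lp/mm Nyquist limit of a 5 um pitch.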
I don’t know the maths, I can only gauge my lens performance on bug bits 🤨
So is that why they say 5 um to 11 um for best results?
The lens I have is 1500 lp/mm on the object side, for a line width of around 400 nm.
what does this mean to a layman 🤔
The lens I have is 1500 lp/mm on the object side, for a line width of around 400 nm
But isn’t that for a projection of a 100x100 mm image? Aren’t you going to squash the projection for your MF sensor?
@JimKasson has written:The lens I have is 1500 lp/mm on the object side, for a line width of around 400 nm.
This is very close to the diffraction limit - what you can possibly image with light. Do you have photos with such detail?
At the recommended 436 nm spectral line and magnification ratio, the lens is supposed to be diffraction limited. But note I’m talking about using the lens reversed, so the sensor side sees only 150 lp/mm.
The lens was designed for stepping semiconductor lithography.
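As a hedged back-of-the-envelope check (the lens NA is not quoted anywhere in this thread), the standard incoherent cutoff 2·NA/λ indicates what NA a 1500 lp/mm spec at 436 nm would imply at minimum:

```python
# Minimum NA implied by 1500 lp/mm at the 436 nm line, using the incoherent
# diffraction cutoff 2*NA/lambda. A real lens needs somewhat more NA, since
# contrast falls to zero exactly at the cutoff.
wavelength_um = 0.436          # g-line mentioned above
cycles_per_um = 1500 / 1000    # 1500 lp/mm on the high-resolution side

min_na = cycles_per_um * wavelength_um / 2
print(f"Implied minimum NA: {min_na:.2f}")   # about 0.33
```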
@JimKasson has written:The lens I have is 1500 lp/mm on the object side, for a line width of around 400 nm
But isn’t that for a projection of a 100x100 mm image? Aren’t you going to squash the projection for your MF sensor?
My sensor is well under 100x100 mm.
@DonaldB has written: @JimKasson has written:The lens I have is 1500 lp/mm on the object side, for a line width of around 400 nm
But isn’t that for a projection of a 100x100 mm image? Aren’t you going to squash the projection for your MF sensor?
My sensor is well under 100x100 mm.
Yes, that’s right, but your lens was designed to project a transparency of 100x100 mm onto a microchip!
But aren’t you only projecting the image onto the MF sensor in reverse?
@JimKasson has written:The lens I have is 1500 lp/mm on the object side, for a line width of around 400 nm.
This is very close to the diffraction limit - what you can possibly image with light. Do you have photos with such detail?
If this was the case back in the ’80s, 40 years ago, then how are we making circuits so small today?
@JACS has written: @JimKasson has written:The lens I have is 1500 lp/mm on the object side, for a line width of around 400 nm.
This is very close to the diffraction limit - what you can possibly image with light. Do you have photos with such detail?
If this was the case back in the ’80s, 40 years ago, then how are we making circuits so small today?
Did some googling. Very interesting. Much more advanced compared to when I used to make my own PCBs, back when I was into electronics 45 years ago.
what does this mean to a layman 🤔
That, given you have a 10x/0.25 objective, for a green 550 nm wavelength you need a pixel of about 5.5 μm, and depending on other factors, smaller pixels may be better. Look up the Rayleigh criterion.
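One way the ~5.5 μm figure falls out, shown here as a sketch rather than the poster’s exact working: take the diffraction-limited period λ/(2·NA) on the object side, project it through the 10x magnification, and apply Nyquist (two pixels per resolved period). The Rayleigh form 0.61·λ/NA gives a slightly coarser answer.

```python
# Pixel-pitch estimate for a 10x/0.25 objective at 550 nm.
# Sketch of one plausible derivation of the ~5.5 um figure quoted above.
wavelength_um = 0.550
na = 0.25
magnification = 10

abbe_object_um = wavelength_um / (2 * na)        # ~1.1 um finest period, object side
rayleigh_object_um = 0.61 * wavelength_um / na   # ~1.34 um Rayleigh separation

for name, d_um in (("lambda/(2*NA)", abbe_object_um), ("Rayleigh", rayleigh_object_um)):
    image_um = d_um * magnification              # projected onto the sensor
    pixel_um = image_um / 2                      # Nyquist: two pixels per period
    print(f"{name}: {image_um:.1f} um at the sensor -> ~{pixel_um:.1f} um pixel")
```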
@JimKasson has written: @DonaldB has written: @JimKasson has written:The lens I have is 1500 lp/mm on the object side, for a line width of around 400 nm
But isn’t that for a projection of a 100x100 mm image? Aren’t you going to squash the projection for your MF sensor?
My sensor is well under 100x100 mm.
Yes, that’s right, but your lens was designed to project a transparency of 100x100 mm onto a microchip!
But aren’t you only projecting the image onto the MF sensor in reverse?
The lens was designed for 1:10. In that usage the wafer side image circle diameter was 14mm and the mask side image circle was 140mm. So as a 10x lens on a 33x44 mm sensor, the object side field of view will be 3.3x4.4mm.
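A quick check of those numbers, assuming the sensor sits on the former mask side of the reversed lens and the subject on the former wafer side:

```python
# Field-of-view and image-circle check for a 33x44 mm sensor behind a 10x lens.
import math

magnification = 10
sensor_mm = (33, 44)        # MF sensor dimensions
mask_circle_mm = 140        # mask-side image circle diameter (sensor side here)
wafer_circle_mm = 14        # wafer-side image circle diameter (subject side here)

object_fov_mm = tuple(s / magnification for s in sensor_mm)   # 3.3 x 4.4 mm
sensor_diag_mm = math.hypot(*sensor_mm)                       # 55 mm
object_diag_mm = sensor_diag_mm / magnification               # 5.5 mm

print(f"Object-side field of view: {object_fov_mm[0]:.1f} x {object_fov_mm[1]:.1f} mm")
print(f"Sensor diagonal {sensor_diag_mm:.0f} mm vs {mask_circle_mm} mm circle: "
      f"{sensor_diag_mm <= mask_circle_mm}")
print(f"Subject diagonal {object_diag_mm:.1f} mm vs {wafer_circle_mm} mm circle: "
      f"{object_diag_mm <= wafer_circle_mm}")
```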
@JACS has written: @JimKasson has written:The lens I have is 1500 lp/mm on the object side, for a line width of around 400 nm.
This is very close to the diffraction limit - what you can possibly image with light. Do you have photos with such detail?
If this was the case back in the ’80s, 40 years ago, then how are we making circuits so small today?
Shorter wavelengths, some tricky optical schemes.
@DonaldB has written: @JACS has written: @JimKasson has written:The lens I have is 1500 lp/mm on the object side, for a line width of around 400 nm.
This is very close to the diffraction limit - what you can possibly image with light. Do you have photos with such detail?
If this was the case back in the ’80s, 40 years ago, then how are we making circuits so small today?
Shorter wavelengths, some tricky optical schemes.
truly amazing
@DonaldB has written: truly amazing
For sure. To get a bit of an idea of scale: if you enlarged the diameter of a human hair of 0.1 mm up to the height of the Empire State Building, one nm would be less than 6 mm high on that scale. And we take for granted our computer chips running 3-5 nm circuitry. For want of a better word, truly mind-boggling.
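The analogy checks out, assuming roughly 443 m for the Empire State Building (height to the tip; not stated in the post):

```python
# Sanity check of the hair-to-skyscraper scale analogy above.
hair_diameter_m = 0.1e-3        # 0.1 mm human hair
building_height_m = 443.0       # assumed Empire State Building height, to the tip
scale = building_height_m / hair_diameter_m

one_nm_scaled_mm = 1e-9 * scale * 1000   # what 1 nm becomes at that scale, in mm
print(f"1 nm scales to about {one_nm_scaled_mm:.1f} mm")   # ~4.4 mm, under 6 mm
```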
@DonaldB has written:truly amazing
For sure. To get a bit of an idea of scale: if you enlarged the diameter of a human hair of 0.1 mm up to the height of the Empire State Building, one nm would be less than 6 mm high on that scale. And we take for granted our computer chips running 3-5 nm circuitry. For want of a better word, truly mind-boggling.
Did you read where the computers that control the lasers have deep learning AI built in and adjust the control accuracy 50,000 times per second? We think the Sony a1 is amazing at 1/400 sec 🤔 I suppose you would expect that with 6 billion dollars spent on development 🤨
@Ghundred has written: @DonaldB has written:truly amazing
For sure. To get a bit of an idea of scale: if you enlarged the diameter of a human hair of 0.1 mm up to the height of the Empire State Building, one nm would be less than 6 mm high on that scale. And we take for granted our computer chips running 3-5 nm circuitry. For want of a better word, truly mind-boggling.
Did you read where the computers that control the lasers have deep learning AI built in and adjust the control accuracy 50,000 times per second? We think the Sony a1 is amazing at 1/400 sec 🤔 I suppose you would expect that with 6 billion dollars spent on development 🤨
What I read is that the lasers that are aimed at the tin droplets have a pulse repetition rate of 50kHz.
The wafer positioning control system operates at 20kHz, with 250 picometer repeatability.
Note that their first systems operated at 436 nm, which is the wavelength for which my lens was designed.
@Ghundred has written: @DonaldB has written:truly amazing
For sure. To get a bit of an idea of scale: if you enlarged the diameter of a human hair of 0.1 mm up to the height of the Empire State Building, one nm would be less than 6 mm high on that scale. And we take for granted our computer chips running 3-5 nm circuitry. For want of a better word, truly mind-boggling.
Did you read where the computers that control the lasers have deep learning AI built in and adjust the control accuracy 50,000 times per second? We think the Sony a1 is amazing at 1/400 sec 🤔 I suppose you would expect that with 6 billion dollars spent on development 🤨
I didn’t see that in the write-up, but you can’t use refractive lenses at that wavelength. It’s all done with mirrors. The optics come from Zeiss, with defect correction at the molecular level. The optical path needs to be in a vacuum, since air absorbs EUV.
Even the mask making is exotic.