While the electric force describes the force one charge or charged body exerts on another, the two objects do not need to be touching physically for this force to act. For this reason, we describe the force exerted through empty space (i.e. where the two objects aren't touching) as an electric field. Any charge or body that exerts an electrical force, with a strength determined chiefly by the amount of charge present and the distance between the objects, will generate an electric field.
The electric field at a point is directly proportional to the electric force exerted on a test charge placed there, or Coulomb force, and inversely proportional to the charge of that particle. In other words, a greater Coulomb force means a stronger electric field, while the same force acting on a larger charge corresponds to a weaker field. The Coulomb force, as discussed under Coulomb's Law below, is inversely proportional to the square of the distance between the charges. The electric field E is given by the formula E = F/q, and the units are Volts per meter (equivalently, Newtons per Coulomb).
By combining Coulomb's Law with our definition of the electric field, the field of a point charge can be written as
E1 = ke * q1 / r^2 * er
where er is again the unit vector pointing radially away from charge q1.
When drawing electric field lines, there are three rules to pay attention to:
- The direction is tangent to the field line (in the direction of flow).
- The density of the lines is proportional to the magnitude of the electric field.
- Vector lines emerge from positive charges and sink towards negative charges.
Adding electric fields to produce a resultant electric field is simple, thanks to the property of superposition which applies to electric fields. Below is an example of how a resultant electric field will be calculated geometrically. The direction of each individual field from the charges is determined by the polarity of the charge.
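As a small sketch of this superposition (the charge values, positions and function name below are my own for illustration, not from the text), the resultant field is just the vector sum of each point charge's E = ke*q/r^2 contribution:

```python
import math

KE = 8.99e9  # Coulomb constant, N*m^2/C^2

def e_field(q, source, point):
    """Field vector at `point` due to a point charge q at `source` (2D)."""
    dx, dy = point[0] - source[0], point[1] - source[1]
    r = math.hypot(dx, dy)
    mag = KE * q / r**2
    return (mag * dx / r, mag * dy / r)  # magnitude along the unit vector er

# A +1 nC and a -1 nC charge, 1 m apart; field at the midpoint.
# Both contributions point in +x (away from the positive charge,
# toward the negative one), so they add constructively.
e1 = e_field(1e-9, (0.0, 0.0), (0.5, 0.0))
e2 = e_field(-1e-9, (1.0, 0.0), (0.5, 0.0))
e_total = (e1[0] + e2[0], e1[1] + e2[1])  # superposition: vector sum
```
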
Electric charge is important in determining how a body or particle will behave and interact electromagnetically. It is also key to understanding how electric fields, electric potentials and electromagnetic waves come into existence. It starts with the atom and its numbers of protons and electrons.
Charges are positive or negative. In a neutral atom, the number of protons in the nucleus is equal to the number of electrons. When an atom gains or loses an electron from this state, it becomes a negatively or positively charged ion, respectively. When bodies or particles exhibit a net charge, either positive or negative, an electric force arises. Net charge can be produced by friction or irradiation. The electrostatic force behaves similarly to the gravitational force; in fact, the formulas look very similar! The most important difference between the two is that the electrostatic force can be attractive or repulsive, while the gravitational force is always attractive. For small bodies, the electrostatic force dominates and the gravitational force is negligible.
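To make "the gravitational force is negligible" concrete, here is a quick sketch comparing the two forces for a pair of electrons. Since both laws scale as 1/r^2, the distance cancels and the ratio is a pure number (the constants below are standard values, not taken from the text):

```python
KE = 8.99e9           # Coulomb constant, N*m^2/C^2
G = 6.674e-11         # gravitational constant, N*m^2/kg^2
E_CHARGE = 1.602e-19  # elementary charge, C
E_MASS = 9.109e-31    # electron mass, kg

# F_electric / F_gravity for two electrons; the 1/r^2 factors cancel,
# so this ratio holds at any separation.
ratio = (KE * E_CHARGE**2) / (G * E_MASS**2)
# ratio is on the order of 10^42
```
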
Charles Coulomb conducted experiments around 1785 to understand how electric charges interact. He devised two main relations that would become Coulomb's Law:
The magnitude of the force between two stationary point charges is
- proportional to the product of the magnitude of the charges and
- inversely proportional to the square of the distance between the two charges.
The following expression describes how one charge will exert a force on another:

F12 = ke * q1 * q2 / r^2 * e12

The unit vector in the direction from charge 1 to charge 2 is written as e12, and the order of the two subscripts indicates the direction of the force, from the first position to the second. Swapping the subscripts reverses the direction of the force: F12 = -F21.
The coefficient ke depends on the unit system and is related to the permittivity by ke = 1/(4π*ε0).
The permittivity of vacuum is ε0 = 8.85*10^(-12) C^2/(N*m^2).
Coulomb forces obey superposition, meaning that the forces from a series of charges add linearly, without affecting each charge's independent effect on its 'target' charge. Coulomb's Law also extends to bodies and non-point charges to describe the electrostatic force applied to an object; the same first equation may be used in this scenario.
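A minimal sketch of the vector form of Coulomb's Law (the charge values, positions and function name are illustrative, not from the text), showing both the 1/r^2 dependence and the F12 = -F21 symmetry:

```python
import math

KE = 8.99e9  # Coulomb constant, N*m^2/C^2

def coulomb_force(q1, p1, q2, p2):
    """Force on charge q2 (at p2) due to charge q1 (at p1), as a 2D vector.

    Implements F = ke * q1 * q2 / r^2 along the unit vector from p1 to p2;
    a negative product q1*q2 flips the vector, i.e. attraction.
    """
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    r = math.hypot(dx, dy)
    mag = KE * q1 * q2 / r**2
    return (mag * dx / r, mag * dy / r)

# +2 uC and -3 uC, 0.3 m apart on the x axis
f12 = coulomb_force(2e-6, (0.0, 0.0), -3e-6, (0.3, 0.0))  # force on q2 from q1
f21 = coulomb_force(-3e-6, (0.3, 0.0), 2e-6, (0.0, 0.0))  # force on q1 from q2
# Opposite signs attract: f12 points in -x, and f12 == -f21.
```
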
Rsoft comes with a number of libraries for real materials. To access these materials, we can add them at any time from the Materials button on the side. However, to build a Multilayer structure that can utilize many materials, select “Multilayer” under 3D Structure Type.
Now, select “Materials…” to add the desired materials. Browse the RSoft libraries to choose a material and use the button in the top right (not the X button) to add the material to the project. Then select OK to be brought back to the Startup Window, where we must now design a layered structure using these materials. Note that you can add more materials while building the layers.
Selecting “Edit Layers…” on the Startup window brings you to the following window. Here you can define your layers by selecting “New Layer”. Enter the Height and Material of the layer, select “Accept Layer”, and repeat the process until the structure is finished. Select OK when done, and select OK on the Startup window if all other settings are complete. This is my structure; note that the layer heights add up to 1. Remember what the sizes of your layers are.
Now, design the shape of the structure. I've made a rectangular waveguide. It is also important to consider where the beam should enter the structure. By default, the beam is focused across the entire structure. If a particular layer is meant to act as the waveguide, the beam should be reduced in size. Knowing the sizes of the layers makes it easy to aim the beam at a particular section of the structure. For my structure, I will aim the beam at the 0.2 GaInAsP layer. The positioning, width, height, angle and more of the launch beam can be edited in the “Launch Parameters” window, accessible through “Launch Fields” on the right side.
Finally, run a simulation with your structure!
There are cases where you may want to simulate a region of air in between two components. A simple way of approaching this task is by creating a region with the same refractive index as air. The segment between the two waveguides (colored in gray) will serve as the “air” region. Right-click on the segment to define its properties and, under “Index Difference”, choose the value to be 1 minus the background index.
Properties for the segment:
Symbol Table Editor:
Notice that in the “air” region the pathway monitor reports zero efficiency. If the gap is short and the second waveguide continues at the same angle, the beam re-enters the waveguide, though with losses.
Index grading is a common method for altering the frequency characteristics of light. In Rsoft, a graded-index component is found under the “Index Taper” tab when right-clicking on a component. By selecting “Tapers…”, one can create a new index taper.
Here, the taper is called “User 1” and defined by an equation step(M*z), with z being the z-coordinate location.
Selecting “Test” on the User Taper Editor will plot the index function of the tapered component:
The index contour is plotted below:
Here, the field pattern:
Light contour plot:
Launch Fields define where light will enter a photonic device in Rsoft CAD. An example that uses multiple launch fields is the beam combiner.
On the sidebar, select “Edit Launch Fields”. To add a new launch, select New and choose the pathway; a waveguide will be selected by default. Moving the launch to a new location will place it elsewhere. Input a parameter other than “default” to change the location and other beam parameters.
Choosing “View Launch” will plot the field amplitude of the launches. For the plot below, the third launch was removed.
Right-clicking on the structure will give the option to choose the “Combine Mode.” Be sure that Merge is selected to allow waveguides to combine.
The electro-optic effect describes the phenomenon that, with an applied voltage, the refractive index of a material can be altered. The electro-optic effect lays the groundwork for many optical and photonic devices; one such application is the electro-optic modulator.
If we consider a waveguide or even a lens, as demonstrated through problems in geometrical optics, we know that the refractive index can alter the direction of propagation of a transmitted beam. A change in refractive index also changes the speed of the wave. The change of light propagation speed in a waveguide acts as phase modulation: the applied voltage carries the modulating information, and light is the carrier signal.
The electro-optic effect comprises both a linear and a non-linear component. The full form of the electro-optic effect equation is

Δ(1/n^2) = r*E + P*E^2

This means that, with an applied electric field E, the resultant change in refractive index is composed of the linear Pockels Effect rE and a non-linear Kerr Effect PE^2.
The Pockels Effect is dependent on the crystal structure and symmetry of the material, along with the direction of the electric field and light wave.
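As a rough numerical sketch of the linear (Pockels) term alone, using the common approximation Δn ≈ -(1/2) n0^3 r E for a field along one crystal axis. The material numbers below (n0 ≈ 2.2 and r ≈ 30.8 pm/V, typical textbook values for lithium niobate) and the electrode geometry are my own assumptions, not from the text:

```python
def pockels_delta_n(n0, r, E):
    """Linear electro-optic index change: dn ≈ -0.5 * n0^3 * r * E."""
    return -0.5 * n0**3 * r * E

# Illustrative values, roughly lithium niobate: n0 = 2.2, r33 = 30.8e-12 m/V.
# Field from 5 V applied across a 10-micron electrode gap:
E = 5.0 / 10e-6  # 5e5 V/m
dn = pockels_delta_n(2.2, 30.8e-12, E)
# dn is tiny (order 1e-4), which is why modulators use long interaction
# lengths or resonant structures to accumulate a usable phase shift.
```
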
When stringing multiple parts together, it is important to check a lightwave system for losses. The BeamPROP simulator, part of the Rsoft package, will display any losses along a waveguide pathway. Here we have an example of an S-bend simulation; there appear to be losses in a few sections.
Here, the design for the S-bend waveguide has a few locations that are leaking, as indicated by the BeamPROP simulation.
The discontinuities are shown below, which are a possible source of loss:
After fixing these discontinuities, the waveguide can be simulated again using BeamPROP. In fact, the losses remain: this remaining loss is called bending loss.
Bending loss is an important topic for waveguides and becomes critical in Photonic Integrated Circuits (PICs).
Rsoft has the ability to create multilayered devices, as was done previously using ATLAS/TCAD. Rather than defining structures through scripts as is done with ATLAS, information about the layers can be defined in tables accessed in Rsoft CAD.
To begin adding layers to a device, such as a waveguide, first draw the device in Rsoft CAD. To design a structure with a substrate and rib waveguide, select Rib/Ridge 3D Structure Type in the Startup Window.
Next, design the structure in Rsoft CAD.
The Symbol Table Editor is needed now not only to define the size of the waveguide, but also the layer properties. The materials for this waveguide will be defined simply, using basic locally defined layers with a user-defined refractive index. Later, we will discuss importing layer libraries to use real materials. To get used to the parameters typically needed for this exercise, the layer properties need not all be defined here before entering the Layer Table Editor.
The Layer Table Editor is found on the Rsoft CAD sidebar. First, assign the substrate layer index and select “New Layer”. The layer name, index and height are defined for this exercise.
After layers have been chosen, the mode profile can be simulated.
An interesting feature of BeamPROP and the other simulators in the Rsoft packages is that the simulation results can be displayed as a running animation. The following is the result of a simulation of an optical fiber. BeamPROP animates the transverse field as a function of the z parameter, which runs along the length of the optical fiber.
To design an optical fiber component with Rsoft CAD, select “Fiber” under 3D Structure Type when making a new project.
To build a cylinder that will be the optical fiber, select the cylinder CAD tool (shown below) and use it to draw in the plane where the base of the cylinder lies.
Dimensions of the fiber can be specified using the symbol tool discussed previously and by right-clicking the object to assign these values. Note that animations of mode patterns through long waveguides are not only available for cylindrical fibers. Fibers may consist of a variety of shapes, and multiple pathways may be included. Simulations can indicate whether a waveguide has potential leaks, or show the interaction of light with a new surface.
BeamPROP is a simulator found in the Rsoft package. Here, we will use BeamPROP to calculate the field distributions of our tapered waveguides. Other methods built within Rsoft CAD will also be explored.
The tapered waveguide that we are simulating is found below. We will use the BeamPROP tool to simulate the field distributions in the waveguide. We will also use the mode calculation tool to simulate the mode profile at each end of the waveguide.
BeamPROP Simulation Results
Mode Profile Simulation
The mode simulation tool is found on the sidebar:
Before choosing the parameters of the Mode Simulator, let's first take a look at the coordinates of the beginning and end of the waveguide. This dialog is found by right-clicking on the component. The window shows that the starting point along the z axis is 1 and the ending point is 43 (the units are micrometers, by the way). We will choose locations close to the ends of the waveguide, at z = 1.5 and z = 42.5.
Parameter selection window:
Results at z = 1.5:
Results at z = 42.5:
Rsoft is a powerful tool for optical and photonic simulation and design. The Rsoft and Synopsys packages come with a number of different tools and simulators, such as BeamPROP, FullWAVE and more. There are also other programs typically found alongside Rsoft, such as OptoDesigner, LaserMOD and OptSim. Here we focus on the very basics of using the Rsoft CAD environment. I am using a student version, which is free for all students in the United States.
New File & Environment
When starting a new file, the following window is opened. We can select the simulation tools needed, the refractive index of the environment (“background index”) and other parameters. Under dimensions, “3D” is selected.
The 3D environment is displayed:
On the side bar, select “Edit Symbols.” Here we can introduce a new symbol and assign it a value using “New Symbol,” filling out the name and expression and selecting “Accept Symbol.”
Next we will draw a rectangle, which will be our waveguide. Select the rectangular segment below:
Now, select the bounds of the rectangle. See example below:
Editing Component Parameters
Right-click on the component to edit its parameters. Here, we will change the refractive index and the length of the component. The Index Difference tab holds the difference in refractive index relative to the background index, which was defined when we created the file. We'll set it to 0.1; since our background index was 1.0, that means the refractive index of the waveguide is 1.1. Alternatively, the value delta that was in the box may be edited from the Symbol menu. We also want to use our symbol “Length” to define the length of our waveguide. We also want this waveguide to be tapered, so the ending vertex width will be set to width*4. Note that width may also be edited in the symbol list.
Here, we have a tapered waveguide:
The envelope of a signal is an important concept. When a signal is modulated, meaning that information is combined with or embedded in a carrier signal, the envelope follows the shape of the signal along its uppermost and lowermost edges.
There are a number of methods for calculating an envelope. When given an in-phase and quadrature signal, the envelope is defined as:
E = sqrt(I^2 + Q^2).
This envelope, if plotted, will trace the exact upper or lower edge of the signal. Whether an exact envelope is sought depends on the level of detail required for the application.
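As a quick sketch of this formula (written in Python rather than MATLAB, and using a synthetic signal of my own), the point-by-point I/Q envelope recovers the amplitude of a modulated carrier exactly:

```python
import math

def iq_envelope(i_samples, q_samples):
    """Point-by-point envelope of an I/Q pair: E = sqrt(I^2 + Q^2)."""
    return [math.sqrt(i * i + q * q) for i, q in zip(i_samples, q_samples)]

# A carrier with a slow amplitude ramp. Since I = A*cos and Q = A*sin,
# sqrt(I^2 + Q^2) returns A at every sample.
n = 100
amp = [1.0 + 0.01 * k for k in range(n)]
I = [a * math.cos(2 * math.pi * 0.1 * k) for k, a in enumerate(amp)]
Q = [a * math.sin(2 * math.pi * 0.1 * k) for k, a in enumerate(amp)]
env = iq_envelope(I, Q)
```
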
Here, this data was collected as a return from a fiber laser source. We seek to characterize this section of the data to determine which of a number of candidate descriptions the return signal fits. The exact envelope from the above formula is less useful for this application.
MATLAB's envelope function is used to calculate the envelope:

[upI, lowI] = envelope(I, x, 'peak');
And this is plotted below with the I and Q signals:
Here are two envelopes depicted without the signal shown. By adjusting the range of interpolation, this envelope can be made smoother. Typically it is less desirable for an envelope to follow so many carrier oscillations, as in the following, where the interpolation range is x = 1000.
Further methods involving filters may also be considered. Below, the I and Q signals are passed through a bandpass filter (to ensure that the data is from the desired frequency range) and a lowpass filter is then applied to the envelope to remove higher-frequency oscillation.
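As a hedged stand-in for that lowpass step (pure Python, with a centered moving average instead of a proper designed filter; the signal and window size below are my own illustrative choices), smoothing an envelope to remove residual carrier ripple might look like:

```python
import math

def moving_average(x, window):
    """Crude lowpass: centered moving average over roughly `window` samples."""
    half = window // 2
    out = []
    for k in range(len(x)):
        lo, hi = max(0, k - half), min(len(x), k + half + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

# A slowly varying envelope plus residual high-frequency ripple
# (ripple period 4 samples, so an 8-sample window averages it out).
n = 200
slow = [1.0 + 0.5 * math.sin(2 * math.pi * k / n) for k in range(n)]
raw = [s + 0.1 * math.cos(2 * math.pi * 0.25 * k) for k, s in enumerate(slow)]
smooth = moving_average(raw, 8)
```
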
The development of advanced semiconductor technologies presents one important challenge: fabrication. Two fabrication methods used in bandgap engineering are Molecular Beam Epitaxy (MBE) and Metal-Organic Chemical Vapour Deposition (MOCVD).
Molecular Beam Epitaxy fabricates compound semiconductor materials under ultra-high vacuum. Atoms, or molecules containing the desired atoms, are directed at a heated substrate. Molecular Beam Epitaxy is highly sensitive. The vacuum systems make use of diffusion pumps or cryo-pumps: diffusion pumps for gas-source MBE and cryo-pumps for solid-source MBE. Effusion cells in MBE allow the flow of molecules through small holes without collision. RHEED, which stands for Reflection High-Energy Electron Diffraction, registers information about the epitaxial growth structure, such as surface smoothness and growth rate, by reflecting high-energy electrons off the surface. The growth chamber is heated to 200 degrees Celsius, while the substrate temperatures are kept in the range of 400-700 degrees Celsius.
MBE is not suitable for large scale production due to the slow growth rate and higher cost of production. However, it is highly accurate, making it highly desired for research and highly complex structures.
MOCVD is a more popular method for growing layers on a semiconductor wafer. MOCVD is primarily chemical: the desired elements are delivered as complex chemical compounds, and the remains are evaporated. MOCVD does not require a high vacuum. This process can be used for a large number of optoelectronic devices with specific properties, including quantum wells. High-quality semiconductor layers at the micrometer level are grown using this process. MOCVD produces a number of toxic byproducts, including AsH3 and PH3.
MOCVD is recommended for simpler devices and for mass production.
Heterojunction is the term for the region where two different materials meet. A heterostructure is a combination of two or more materials. Here, we will explore several interesting cases.
The AlGaAs-InGaAs combination is interesting due to the difference in bandgap energies: AlGaAs has a wider bandgap, while InGaAs has a narrower one. Layering these two materials, with their stark difference in bandgap levels, makes for an interesting demonstration of a heterostructure.
Layering a smaller-bandgap material between a wider-bandgap material has the effect of trapping both electrons and holes. As shown on the right side of the picture below, the center region, made of the narrower-bandgap InGaAs, exhibits high concentrations of both electrons and holes. This leads to a higher rate of carrier recombination, which can generate photons.
Here, the lasing profile of the material under bias:
A commonly used group of materials is InGaAsP, InGaAs and InP. Unlike the above arrangements, these materials may be lattice-matched; lattice-matching may be explored in depth later on. Simulations suggest low or non-existent recombination rates. Although this is a heterostructure, one can see that there are no sudden, drastic offsets between the conduction and valence bands that would create a discontinuity with a high recombination rate.