Category Archives: Electronics

Christmas LED Lights Circuit

Using this simple Christmas LED lights decoration circuit, you can make an 18-LED flasher to decorate the Christmas tree. The white, blue, and red LEDs flash at different rates to give a colorful display. The circuit is light sensitive, so it turns on automatically in the evening and stays on till morning.

The circuit uses the popular binary counter IC CD4060 to flash the LEDs at different rates. Components C1, VR2, and R1 form the oscillator, and the outputs at pins 7, 5, and 4 go high and low sequentially. When one output turns high, a set of three LEDs turns on; when the same output turns off, a second set turns on. The sequence is similar for the other two sets of LEDs, but with different timings. The speed of the flashing can be controlled through VR2.
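The different flash rates follow from how a CD4060 divides its internal oscillator. As a rough sketch, the datasheet approximation for the RC oscillator is f ≈ 1/(2.3·Rt·Ct), and each divider output Qn toggles at f/2ⁿ (on the CD4060, pin 7 is Q4, pin 5 is Q5, pin 4 is Q6). The component values below are assumptions for illustration only, not taken from the schematic:

```python
# Rough flash-rate estimate for a CD4060 RC oscillator.
# Datasheet approximation: f_osc ≈ 1 / (2.3 * Rt * Ct)

def cd4060_output_hz(rt_ohms, ct_farads, stage):
    """Frequency at divider output Qn (pin 7 = Q4, pin 5 = Q5, pin 4 = Q6)."""
    f_osc = 1.0 / (2.3 * rt_ohms * ct_farads)
    return f_osc / (2 ** stage)

# Assumed values: Rt = R1 + VR2 = 100 kOhm, Ct = C1 = 0.1 uF
rt, ct = 100e3, 0.1e-6
for pin, stage in ((7, 4), (5, 5), (4, 6)):
    print(f"pin {pin} (Q{stage}): {cd4060_output_hz(rt, ct, stage):.2f} Hz")
```

Each successive pin flashes at half the rate of the previous one, which is why the three LED groups drift in and out of step.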

Christmas LED Lights Decoration Circuit Schematic

An LDR with VR1 is provided to activate the IC in the evening. In daylight, the LDR conducts and keeps the reset pin 12 of IC1 high, inhibiting it from working. When daylight ceases, pin 12 goes low and the flasher starts working. VR1 adjusts the sensitivity of the LDR to the required light level. If more LEDs are required, increase the supply voltage to 18 V DC. The circuit can be powered using a standard 12-18 V, 500 mA adapter. Use high-brightness transparent LEDs for an attractive display.


LDR PC Desk Lamp

Most of the PC desk lamps available in the market light up whenever there is input power. These do not take into account whether there is a real need for the light. Here is an intelligent PC desk lamp circuit that overcomes the problem.
It senses the light level in the room to determine the actual need for light and lights up only if required. It is designed to work with a PC and remains on only while the PC on the desk is running. It uses a MOC3021 opto-triac.

The front end of the circuit is powered by the 5 V DC supply available from the USB port of the PC. When the circuit is powered and there is sufficient light, the resistance of the light sensor LDR (R2) is low, so most of the base current of transistor T1 finds an easy alternative path via the LDR and T1 remains cut off. During darkness, the LDR behaves almost as an open circuit, and the current through the sensitivity-control preset pot (P1) and associated resistors (R1, R3) flows into the transistor’s base. As a consequence, T1 conducts and energises the opto-triac PC1. The lamp-driver triac T2 is then fired through the opto-triac PC1 and switches on the power supply to the incandescent lamp.
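The light/dark decision described above is essentially a voltage divider feeding T1’s base. As a sketch of the logic, assuming illustrative values (a 47 kΩ series leg for P1+R1+R3 and typical LDR resistances; the real thresholds depend on the actual parts):

```python
# Sketch of the light/dark decision at T1's base. All component values
# are assumptions for illustration, not taken from the schematic.

V_SUPPLY = 5.0   # USB supply voltage
V_BE_ON = 0.6    # approximate silicon transistor turn-on voltage

def t1_conducts(r_ldr, r_series=47e3):
    """True if the LDR is dark enough that T1's base rises above ~0.6 V.

    The LDR forms the lower leg of a divider; in bright light its low
    resistance clamps the base below V_BE_ON and T1 stays cut off.
    """
    v_base = V_SUPPLY * r_ldr / (r_series + r_ldr)
    return v_base > V_BE_ON

print(t1_conducts(2e3))    # bright room: LDR ~2 kOhm  -> lamp stays off
print(t1_conducts(500e3))  # darkness:    LDR ~500 kOhm -> lamp turns on
```

Adjusting P1 in the real circuit shifts the series resistance, and therefore the ambient light level at which the lamp switches on.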

The circuit can be constructed on a medium-size PCB. After construction, enclose the finished circuit in a well-insulated plastic cabinet. Then drill holes for mounting the ‘B’-type USB input socket, the power switching terminals, the LDR, etc. This circuit is meant for use in conjunction with personal computers to switch on an associated light-sensitive table lamp or similar load. An optional electromagnetic relay can also be wired at the output of the circuit to switch heavy electrical loads. For interconnection between the PC and the control circuit, use a standard USB cable with an ‘A’-type connector on one end and a ‘B’-type connector on the other.

LDR USB Desktop Lamp Circuit Schematic

Warning! This LDR PC desk lamp circuit is opto-isolated from the mains. However, some parts of the circuit carry dangerously high voltages, so take extreme care while testing, using, or repairing it to avoid fatal electric shock.


Commercial Circuit Simulator Goes Free – Hackaday

If you are looking for simulation software, you are probably thinking of LTspice or one of the open-source simulators like Ngspice (which drives Oregano and Qucs-S) or GnuCap. However, there is a new free option after the closing of Spectrum Software last year: Micro-Cap 12. You may be thinking: why use another closed-source simulator? Well, all simulators have particular strengths, but Micro-Cap has very nice features and used to retail for about $4,500.
The simulator boasts a multipage schematic editor, native robust digital simulation, Monte Carlo analysis, 33,000 parts in its library, worst-case and smoke analysis, Smith charts, and it can even incorporate spreadsheets. There’s a built-in designer for active and passive filters. Have a look at the brochure and you will see this is a pretty serious piece of software. And now it’s at least free as in beer.

The number of models supported for active devices is impressive and includes some very recent MOSFET models, not just the old standard ones. It can read just about any regular Spice or IBIS model, and it can export Spice files if you want to use another engine or share designs with other Spice users. Quite a few examples are provided, and there are over 2,000 standard digital parts, including all the usual 7400 families, CD4000 CMOS, and even ECL.
As a bonus, we tried it under Wine and it worked well — at least the 32-bit version. The 64-bit one would probably work with a little effort. On a big monitor, you might want to use Winecfg to set a higher DPI setting, although the toolbar icons are fixed in size which is a little inconvenient. You can, however, select “large toolbar” on the Options | Preference menu, which will help.
One nice touch is that you can view a simulation and interactively change component values and watch the results update right away.
We frequently use Spice when we are too lazy to do the math required to pick an optimal set of values. With this software, you can set ranges for various circuit components, tell the program what you want to optimize, and it will compute the best values for you.
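Micro-Cap does this optimization inside the simulator; the toy below only illustrates the idea with a brute-force sweep, picking a pair of E12-series resistors so a divider from 12 V lands closest to a 5 V target (all of this is an illustration, not Micro-Cap’s actual algorithm):

```python
# Brute-force version of "sweep values, pick the best": choose two E12
# resistors so a divider from 12 V lands closest to 5 V.
from itertools import product

E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]
values = [m * d for d in (1e3, 1e4, 1e5) for m in E12]  # 1 kOhm .. 820 kOhm

def best_divider(v_in=12.0, v_target=5.0):
    """Return (r_top, r_bot) minimizing the divider's error from v_target."""
    return min(product(values, values),
               key=lambda rr: abs(v_in * rr[1] / (rr[0] + rr[1]) - v_target))

r_top, r_bot = best_divider()
print(r_top, r_bot, 12.0 * r_bot / (r_top + r_bot))
```

A real optimizer is smarter than exhaustive search, but the payoff is the same: you state the goal and the ranges, and the tool finds the values.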
The smoke analysis is a fairly unusual feature. The idea is to run a transient analysis while the program determines whether any circuit values exceed a component’s maximum rating. You get a nice colored graph that tells you how close you are to smoke or, if you have some red bars, which parts will smoke.
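The concept behind smoke analysis can be sketched in a few lines: compare each part’s worst-case stress from a transient run against its rating and flag anything over the limit. The numbers below are made up for illustration:

```python
# Miniature "smoke analysis": compare each part's peak stress from a
# (here, precomputed) transient run against its rating.

def smoke_report(parts):
    """parts: dict name -> (peak_stress, rating), e.g. watts for resistors."""
    report = {}
    for name, (peak, rating) in parts.items():
        report[name] = ("SMOKE" if peak > rating
                        else f"{100 * peak / rating:.0f}% of rating")
    return report

parts = {
    "R1 (0.25 W)": (0.18, 0.25),   # warm but within rating
    "Q1 (0.5 W)":  (0.62, 0.50),   # exceeds rating -> flagged
}
for name, verdict in smoke_report(parts).items():
    print(name, "->", verdict)
```

Micro-Cap presents the same information graphically, as colored bars per component rather than a text report.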

Another neat feature is that you can create very cool 3D plots. This is especially useful if you are stepping parameters or measuring the effect on parameter variation like temperature.
One other feature we liked is that the program can output a netlist for printed circuit board programs including Protel, Accel, Orcad, and PADS. Over 18,000 components in the library have packages available and there is a package editor. We wish it would work with KiCAD, although we are pretty sure you could figure out some conversion path from one of the formats available.
The software had been under development since 1982. We don’t know the circumstances of Spectrum’s closing, but we hope it was to move on to something great. In any case, we appreciate the free release of this powerful simulator that can give LTspice a run for its money. True, we expect there won’t be future development, but the package seems very complete, and with the ability to import models it will be very useful for a long time to come.
If you are trying to learn the program, there are some starting instructions for an older version that should get you the basics. You can also find the user’s manual and a reference manual on the site.
We went looking for tutorials and found that [Kiss Analog] just started a set of video tutorials. There’s only one complete, so far (see below), but we are sure there will be more on the way.

If you’d rather do LTSpice, we have a tutorial. Then again, for just playing around, the Falstad simulator is pretty nice and requires no installation.
The “Blender” of circuit-simulation.
Except Blender provides source code… 🙂
I hope it has better UI than Blender had before 2.8…
I don’t use Blender, so I don’t quite understand the analogy. Does it mean it’s good or bad?
It means “very good”
Blender is for 3d animation and modeling, is open source and free, and is in the top of the class for that type of software, rivaling some pretty expensive commercial software. Used by professionals too.
If you go to youtube and search “blender”, just look at the thumbnails, and perhaps read the snippit of description, you’ll get an idea. (If you aren’t into 3d animation there’s no real need to watch any of the videos)
Blender is a great, industry standard 3D modeling and animation software. And video editor. And 2D graphics and animation tool. And much more than that. It can’t replace LibreOffice or GIMP, yet. But it will probably be added in one of future releases…
Why would Blender have to replace LibreOffice or GIMP? You don’t use pliers to hammer in nails neither, do you?
Super Pliers 3000 Extended Edition comes with a built in atomic powered nail insertion feature.
So yes 😛
I might if I don’t have a hammer or the pliers are closer.
I used pliers to hammer nails since I was young. They were the right size, feel, and weight. I don’t do a lot of hammering. I finally bought a hammer a few years ago at a dollar store, for flattening home-made PCB vias, and the price was right.
I don’t use any of the 3 packages either, but I do use Spice.
Typically it would mean really good and powerful, professional quality, and free… along with a UI that’s completely borked and has a massive learning curve, doing things with different inputs than any other software. Not sure if that applies here, but that’s what Blender is.
Not exactly. Blender came to the masses because a guy who worked at the company that developed it took over the source code so that development could continue after the commercial effort ended. That is not yet the case here.
Not quite. Blender was closed source and one of the developers offered to buy it from the company. The company gave him a number, and he raised the money to pay for it, with donations, in order to make it open source. In this case it’s basically abandonware. If you invest your time into learning it, then you will be stuck without any new features etc.
source of that statement please?
and disagrees with your statement.
Yeah, well, that’s an example of a winner writing its own history. Some of us were around when it happened, plus what happened is recorded elsewhere on the web as well.
So what did happen ?
Just my memory.
Yes, it was originally a closed-source, in-house tool, but free for anyone to use if you had the hardware/OS to run it. Your last two sentences, though, sharply veer off into left field and abandon reality: Blender is not just constantly being worked on, but major companies are heavily investing in it. Please make a practice of fact-checking before posting to avoid looking foolish in the future.
Seems to me that was a comment regarding the software referred to in this article. I could be mistaken though.
Chris Bruner was talking specifically about Blender and made a statement that was UTTERLY untrue about its current state; to say that is it ‘abandonware’ indicates that he didn’t even TRY to check the facts.
He didn’t say Blender is abandonware. First he describes what happened to Blender, then says “In this case”, and switches to what’s happened with Microcap.
Can anyone please suggest application to draw basic electrical circuits like in textbooks.
For papers it seems like the two most common options are Microsoft Visio and LaTeX using the circuitikz package. I am personally not a fan of Visio but it does get the job done with minimal learning curve. LaTeX gives excellent results but the learning curve is pretty steep.
If you just want nice looking schematics (not necessarily for publication) I would suggest getting comfortable with your EDA software of preference (I like Kicad)
I used Visio at one place I worked because it was the closest thing to a CAD that I could use without the hassle of getting approval for a req., as it was classified as general office software.
I wrote a converter program to convert our CAD footprint and component models into it and used that a lot for placement planning. Being able to export AutoCAD DXF for our mech people and Windows Metafile for Word was very handy.
Visio is pretty expensive. is free, even for commercial use, and is pretty comparable.
KiCad can export schematic drawings to .svg files, which should be importable in any decent word processor or desktop publishing program.
I agree. TinyCAD is quick, easy, reasonable library. Easiest way to bang out a schematic. Just wish it was ported to ‘nix.
Apparently Linux TinyCad is completely unrelated. And not updated for two years.
Dia might be a suitable replacement, kind of like Visio.
QElectroTech is nice.
Dia (like visio)
some links:
I draw circuits with LTspice. Under the electrical tab is about all that you should need.
Why wouldn’t they have also released the source code? This could be great.
Did anyone ask?
Companies don’t like to give away for free things they worked hard on. Just in case someone else wants to buy the rights, perhaps.
But also, releasing source code may have a cost. I seem to recall reading that when Netscape went open source, the company spent time and money preparing the code. I can’t remember the details, but the code couldn’t be released as-is. Maybe it needed to be cleaned up.
When the mail/newsreader Pine was let go by the University of Washington, the license was changed and some work was needed. Maybe it was one last update, but I vaguely recall more work was required.
They’re not allowed to open source software components that they bought from someone else. The program code very likely includes packages that they don’t own the rights to.
Spice software quite often contains trade secrets, e.g. encryption that protects chip vendors’ IP so that they will release a more accurate representation of what’s inside. Spice models are serious business, as a simulation is only as good as its models.
Also, there could be some serious legal consequences if the software vendor retroactively open-sourced the code.
When LucasArts closed, Raven released the source code from the Jedi Knight games they’d worked on.
They quickly took it back down, because the code contained a lot of proprietary stuff from Microsoft and Bink that they didn’t have the rights to publish. It was a couple days before the source was sanitized enough for publishing again.
Releasing the source for a product can be a lot more complicated than uploading the code.
Thanks for sharing! This might come in handy and I have no issues with free as in beer software.
If I could get this to run on Linux Mint it would actually be useful. But between Wine, PlayonLinux, Mono, and Microcap, no such luck.
I am running Microcap just fine with Wine. Make sure you have the right Wine installed (32bit vs 64bit for the version of the executable you are trying to run).
I can confirm the 32 bit CD version works with the playonlinux 4.3.4 package
open playonlinux
click install a program
click install a nonlisted-program
install a program in a new virtual drive
name it microcap12
select use another version of wine
choose wine 4.21
choose 32 bit Windows installation
browse the microcap12 extracted CDROM files path
choose “setup.exe”
do not run microcap program on install completion
Click playonlinux Configure
select microcap12 virtual drive
Click make a new shortcut from this virtual drive
select mc12.exe
exit wizard
exit Configure
click mc12 on playonlinux list menu
I was running the windows 7 OS signature with playonlinux’s 32bit wine 4.21 during this process
Note, installing wine 4.21 can be done though playonlinux Configure menus
Thanks… turns out my problem was Ubuntu, who screwed up a library needed to install Wine; sorting it out now on Mint forum…..
Aaaaand it works! Just like olden days, but back then there weren’t memristors in the library.
Had to disable mscoree in winecfg and it played nice.
The executable-only download will fail because it doesn’t include the support files that the CD package does.
The company behind Micro-Cap closed after its founder retired. It was pretty much a one-man shop, so there won’t be any more updates, though bugs were still being fixed (the last release was in November or December last year).
There is another simulator that was recently made freeware: SuperSPICE. Same thing: a one-man shop moving on.
Well, the chap was/is one hell of a software engineer, because this looks great. Another string in my bow of simulation tools.
humor too,
Wow he uhh…sure has strong opinions on Islam.
He is an “ardent atheist”, so of course he is going to have strong opposition to Islam.
The only nice things I can say about his stance on religion is that it was hard to dig through the website to find. Let’s leave it alone?
Unfortunately, he isn’t wrong. :sigh:
Hackaday guys – you should track him down and interview him!
Yeah yeah, Nah, FOSS or GTFO.
What a greedy attitude.
Possibly more like not wanting to get stuck learning software that relies on a small set of other people who have the sources to update/fix it (and who perhaps aren’t even available, as is possibly the case here).
I don’t want to waste time on software with a poorly designed UI, or one where the UI was an afterthought. Poor and out-of-date documentation and almost vegan-type politics are a turn-off.
You assume he is being greedy, but FOSS means he can contribute to development. Maybe he wants to add new features?
C’mon now, how many people who talk like that actually contribute a damn thing back? Well, except for complaining on Github I guess…
The logical fallacies abound: confirmation bias, anecdotal evidence, false equivalences…
The fact is that there is no way to say how many people who talk like that actually contribute to the projects, as there is no statistical analysis of people who talk like that.
That being said, it is very presumptuous to think that your personal experiences of people who talk like that represent the general public of FOSS projects.
It is also illogical to chalk up people who talk like that specifically as greedy, when that person could just be anti-authoritarian.
I don’t often contribute specifically to FOSS projects; I write detailed bug reports when I run into them, but I don’t contribute code except for one or two projects. Does this mean I am greedy because I run several different packages but only contribute to one or two? Does that mean I am only allowed to run the packages that I contribute to?
In my opinion, closed-source software is greedy and open source is cheap. The original retort of “greedy” is wrong; they should have said “cheap-ass.” I’m neither of those posters, but yes, I am a cheap-ass, and I use FOSS because I don’t have to pay some large conglomerate for their shitty software that is just as buggy as any other piece of software. The fact is that the more people use FOSS, the more people they encourage to use FOSS, creating a network effect that benefits the group more than the individual. So championing FOSS even without code contributions is still a contribution, and thus not really greedy.
I can’t reply to Mike for some reason, so I’ll reply here.
“The logical fallacies abound: confirmation bias, anecdotal evidence, false equivalences…” You just throw it out there, as if you had any hard data that contradicts the views.
Open source projects can be successful, but only if they reach a large size and financial support. To claim otherwise shows a lack of experience.
Other projects rely heavily on the code originator, or someone who took over. Most of the contributions end up being fixes.
For anything larger you need a lot of organization to integrate the submissions and ensure a certain quality. Almost nobody can do that on the side, unpaid.
“The fact is that there is no way to say how many people who talk like that actually contribute to the projects, as there is no statistical analysis of people who talk like that.”
Oh please! Attitude says a lot about what to expect from people. Asking kindly or saying code or GTFO is not the same.
Asking for statistical data for everything is just a cheap way to disagree, and you can’t evaluate everything statistically anyway, because categorizing it is itself a subjective, faulty process.
Maybe you should try Bayesian statistics to understand how people are able to act with incomplete information.
“In my opinion, closed source software is greedy and open source is cheap.”
The rest of your post is just based on your wishes, and shows that the prediction was right. You think network effects just make it happen magically. You added no statistics of your own, either.
You simply hate on individual software developers who make a living from software and compare them to big conglomerates. You are really brave; I admire you.
A little tip: the most successful open-source software is actually financed by big conglomerates, such as Google, Oracle, MS, IBM, etc. Yes, especially Linux, who would have thought…
“Then it is also illogical to chalk people who talk like that specifically to be greedy but that person could just be anti authoritarian. ”
“GTFO” is not anti-authoritarian; it’s plain rude and disrespectful of the work someone has done.
People have to understand that you can’t treat single developers and big corporations the same way. Those are single people you are insulting, and often the work of a lifetime.
Ever thought of that?
To that I have to relate this story. I volunteer at a small rural county museum, where the coordinator relies on her son as her computer tech advisor. He is someone who is big on the FOSS concept. My response was that I like the concept, but it comes with a boatload of hard-core, rigid, unmoving attitude. The problem being, one can never know if the whiners are the doers or the freeloaders who never donate financial support to the projects they use on a daily basis. My opinion is that the whiners are the freeloaders, not the doers, because that’s how it is in the real world.
Agreed. I have made freeware and open source software.
After open sourcing what I see most of the time is the same type of whiny comments. If the project isn’t huge, most of the work is done by one person or a handful.
The ruder and more demanding they were, the less they contributed.
I’ll bet $100 you wouldn’t know what to do with the source code if you had it.
Opening formerly closed-source products is much harder than it seems, as their source code may contain third-party closed-source parts that they have no rights to release. Sorting out and ditching these parts would cost money, and swapping them with FOSS code to make a product that works again could cost a fortune. Releasing the source code as-is would expose them to nasty lawsuits. It’s not always about greed.
I’ve noticed that after installing the CD version, there were updates that could be downloaded.
I imagine the website will not be maintained indefinitely, so ideally some way needs to be found to update the CD installer files with the latest files that can be downloaded in the update.
And of course, ideally the whole thing needs to be made open source and put somewhere the community can continue the development.
Good day. I am looking for a simulator for electronic circuits. Do you think this one will help?
As mentioned in the article a fun and workable place to start is just
Wow, this gave me a huge flashback. My final year college project in Electronic Engineering in University College Dublin in 1990 involved a comparison of Microcap vs PSpice!
At the time, IIRC, Microcap won hands-down for usability but PSpice won on accuracy, compared to the real (albeit very simple) circuits we built as references.
I just had to download the latest version to have a look. Yup, it’s come on a long way in 30 years 😀
There goes the morning…
I don’t mean to be negative. It’s very generous that the developer will let us use it for free.
Eventually we all die.
I don’t think I want to spend too many of my living moments learning a tool that will never again see an update. If the source were released so that it could be forked and live on, then that would be different.
Excuses, excuses.
> Released in June 1980, this product was the first integrated circuit editor and logic simulation system available for personal computers. Its primary goal was to provide a “circuit creation and simulation” environment for digital simulation.
You are looking at code that has been maintained for almost 40 years! It is older than some people here. The code should be mature enough that it is relatively bug-free for general use anyway.
There’s no code that’s so bug-free that it can’t have new features added to remedy that.
Yes totally agree, I want long term corporate support like I get for my symbian phone, err, my microsoft Windows Phone that replaced it, err my Google+ account, sorry I meant my Sonos……
No one-man-band 40 year fly-by-nighters for me!
This was not a product endorsement or recommendation, simply news. Thank you, Al, for making many of us aware of an option we otherwise wouldn’t know about. The only downside could be that, in time, the only user support will come from the user community, if that’s a downside.
This is some top-notch software and you’d be a fool not to get it and learn it, because it’s very powerful. I’ve been using it since the days it ran on DOS 6.x, when you needed a 486DX computer with the math co-processor to get the full features. The Windows 95 version pretty much set the tone and standard for everything that came after it, including LTspice.
This program’s father was SPICE, the first circuit analysis program, written in Fortran in the ’60s. Then came PSpice, which was bought or cloned by MicroSim and, I assume, improved by Spectrum. My guess/opinion is that it is the best. I used and taught it for many years. Bo W4GHV, since ’54.
Nice! thanks for the heads up!!
has been buggy on win 8.1
It’s Win8.1 that is buggy!
What was said above.
Can it calculate EDS or missing resistance values?
In general, simulators model finished circuits. I don’t know if µCap has any solver functions, but I’d be surprised if it did. You can probably use it to brute-force a solution by having it iterate on values, but unless you really don’t know how to solve something, this is not the way to do things.
Not true. Teaching a college lab for a complex amplifier design I tinkered with PSPICE while the students sweated. I did a rough design myself, swept component values, etc. I finished before the students and used it as a great learning tool. BTW I caught bitching by the older professors for teaching calculator use. They remained in the stone age.
How to download
I run this on my MacBook Pro with Parallels installed so that I can run Windows based programs. Works great – and fast! I personally think the GUI is as intuitive and pleasing to the eye as any I’ve used. Cheers!
Thank you so much for the link to my YouTube channel! My little channel had a Huge boost! So great of you – and I love this site!!
Getting an error: No such file or directory ‘C:\Users\username\downloads\mc12\standard.cmp’. Anyone have a solution? Doing this with both the x86 and 64-bit executables.
Ah, never mind, apparently the ‘executable only’ download doesn’t work. Have to download the CD version.
Just rename standard0.cmp to standard.cmp
The Spectrum software website is now gone 🙁
CD Eng version available
It is actually still up – or just the download page is. Search for that and don’t go to the home page. Cheers
Is there a kind of Micro-Cap forum where you could ask questions in case of problems? I don’t see much like this on the web… If there is no support at all, nor any kind of forum, I don’t see much of a future for Micro-Cap.
I do not know whether the Spectrum site was gone at the end of March, but today it was there and accessible. I just downloaded Micro-Cap 12 directly (and checked the files in Virus Total and they were all green).
Full disclosure: I haven’t attempted to install it.
FWIW, the FOSS PCB layout tool pcb-rnd can import MicroCap netlists, so that’s a complete toolchain if you want to use MicroCap
It seems there is no forum for keeping this tool going. If you know of one, please reply.


Will Reversible Logic Save Moore's Law? It's Difficult to Say – DesignNews

Soham Bhattacharya | Sep 27, 2021
In this era of semiconductor technology, reversible computing plays a vital role in reducing power dissipation and improving energy efficiency when building processors or any kind of simple or complex digital logic circuit. Several lines of research are under way around the world that could improve energy efficiency beyond existing approaches.
Moore’s law states that as transistors are made smaller, they become faster, cheaper, and more efficient. Because of this, the semiconductor industry is busy fabricating smaller, more densely packed transistors. But nowadays, some experts in industry and academia say that shrinking the transistor will no longer yield the improvements it used to. Clock speeds stagnated over a decade ago, which is the reason chips are now built with multiple cores. Even these multi-core processors face the “dark silicon” problem: part of the chip must be powered off to avoid overheating. The failure of Dennard scaling poses a further severe design challenge, because voltage scaling is no longer in line with transistor scaling.
Related: Open-Source Verification not as Easy as Design
The dark silicon problem has caused several issues with the power dissipation and design implementation of chips. Reversible computing could play a vital role in reducing the power dissipation of chips. The idea of reversible computing goes to the very heart of thermodynamics and information theory, and it is the only conceivable way, within the laws of physics, that we could keep improving the cost and energy efficiency of general-purpose computing. To see how the concept of reversible computation emerged, we have to go back in time a little.
Physicist Rolf Landauer of IBM published a paper in 1961 named “Irreversibility and Heat Generation in the Computing Process.” Landauer argued that the logically irreversible character of ordinary computational operations has direct implications for the thermodynamic behavior of a device performing those operations. Landauer’s conclusion, which has since been experimentally confirmed, was that each bit erasure must dissipate at least about 17 thousandths of an electron volt at room temperature. This is a tiny amount of energy, but given all the operations a computer performs, it adds up.
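The 17 meV figure is just Landauer’s bound kT·ln 2 expressed in electron volts. A quick sanity check (taking 293 K, about 20 °C, as “room temperature”):

```python
# Landauer's bound kT*ln(2), expressed in electron volts.
import math

K_B = 1.380649e-23          # Boltzmann constant, J/K (exact, SI 2019)
E_CHARGE = 1.602176634e-19  # elementary charge, C (exact)

def landauer_ev(temp_kelvin):
    """Minimum energy to erase one bit, in eV."""
    return K_B * temp_kelvin * math.log(2) / E_CHARGE

print(f"{landauer_ev(293):.4f} eV")  # prints 0.0175 eV, i.e. ~17.5 meV
```

At 300 K the bound is closer to 17.9 meV, so the article’s “17-thousandths of an electron volt” is the right order of magnitude for any reasonable room temperature.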
Related: Two Myths About Silicon Photonic Chips
Present-day CMOS technology does much worse than Landauer’s figure, dissipating something in the neighborhood of 5,000 electron volts per bit erased. Standard CMOS designs could be improved in this respect, but they will never get much below about 500 eV of energy lost per bit erased, still a long way from Landauer’s lower limit.
In 1973, Charles Bennett showed that it is feasible to build completely reversible computers capable of performing any computation, by uncomputing the operations that produced intermediate results. This would allow any temporary memory to be reused for subsequent calculations without ever erasing or overwriting it. Reversible computations, if carried out on the right hardware, could thus evade Landauer’s limit.
Unfortunately, Bennett’s idea of using reversible processing to make computation far more energy efficient languished in scholarly backwaters for decades. The problem is that it is tough to design a system that does anything computationally interesting without incurring a large entropy increase with every operation. However, technology has improved, and the need to limit energy use is now acute. So a few researchers are once again looking to reversible computing to save energy.
In the late 1970s and early 1980s, two scientists, Edward Fredkin and Tommaso Toffoli, took the issue more seriously and created two essential abstract computational primitives, the Fredkin gate and the Toffoli gate. Several other gates came into play afterward, such as the Feynman gate, Double Feynman gate, HNG gate, TSG gate, and many more. These gates made it possible to implement the combinational and sequential logic circuits used in the semiconductor industry reversibly.
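To make the idea concrete, here is a minimal sketch (in Python, purely for illustration) of the Fredkin and Toffoli gates. Applying either gate twice returns the original bits, which is exactly what logical reversibility means:

```python
def toffoli(a, b, c):
    """Toffoli (CCNOT): flips target c iff both controls a and b are 1."""
    return a, b, c ^ (a & b)

def fredkin(c, x, y):
    """Fredkin (CSWAP): swaps x and y iff control c is 1."""
    return (c, y, x) if c else (c, x, y)

# Reversibility: each gate is its own inverse on all 8 input combinations.
for bits in [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]:
    assert toffoli(*toffoli(*bits)) == bits
    assert fredkin(*fredkin(*bits)) == bits

# Toffoli computes AND reversibly: with the target preset to 0,
# the output target carries a AND b without erasing the inputs.
print(toffoli(1, 1, 0))  # (1, 1, 1)
```

Because no input information is destroyed, no bit erasure (and hence no Landauer dissipation) is logically required.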
Ressler took the first step in investigating the requirements for reversible computers. He implemented a simple accumulator-based machine using only Fredkin gates, and he discussed control-flow issues and the idea of a garbage stack to hold the extra operands produced by irreversible operations. Still, his design bears little resemblance to a modern processor model: the data path in Ressler’s work does not express forward and reverse components, relying instead on the instruction set and reversible Fredkin gates to guarantee reversibility. Later, researchers such as Hall and Baker took up further issues in reversible computing. The Pendulum, a reversible computer architecture by Carlin James Vieri in 1993, extended the previous work by Ressler, Hall, and Baker; the thesis assumed that bit erasure must be avoided and that all computations must be logically reversible.
All things considered, though, reversible computing is not easy. Without a doubt, the design obstacles are colossal. Achieving efficient reversible processing with any technology will probably require a thorough redesign of our entire chip-design infrastructure, and we will also need to retrain a large part of the digital-engineering workforce in the new design methodologies. But the difficulty of these challenges would be a very poor excuse for not confronting them. Right now we have arrived at a noteworthy point in the evolution of computing technology, and we must choose a path soon. If we continue on our current course, we would be abandoning the future of computing, accepting that the energy efficiency of our hardware will soon plateau. And even a quantum-computing breakthrough would only dramatically accelerate a few particular classes of calculations, not computing in general. So, as far as we can tell, an unbounded future for computing awaits us if we are bold enough to seize it.
Soham Bhattacharya is studying for his Master’s degree in Electronics and Communication Engineering from Heritage Institute of Technology, Kolkata, India.
Frequency counter circuit

You may have already seen various projects on many websites named frequency counter, digital frequency counter, etc. I’m posting just another one of them, showing the use of the timer/counter of an AVR microcontroller (ATmega8) in one of its forms. This circuit can be used as a simple microcontroller project for your engineering courses. The frequency of a periodic signal is the number of cycles it repeats per second, so if we count the number of cycles recorded in one second, that count directly reads the frequency. What we are going to make is a frequency counter circuit, which can also be called a frequency meter.

To make this frequency meter we need 1) a signal (whose frequency has to be counted), 2) an ATmega8 AVR microcontroller, and 3) an LCD to display the counted frequency. I assume that you are familiar with the AVR ATmega8 and know how to program it. You also need to know how to interface an LCD with AVR.

Now let’s get into the details of the project – a simple frequency counter, or in other words a frequency meter.

Take a look at the circuit diagram given below and also skim through the program given towards the end of this article.

Description of circuit:-

So what I have done here is: set the counter to zero, wait for 1 s, and read the counter again. But remember, you need to read the value immediately after the delay loop ends. It is simple: just assign a variable and copy the count to it. The data type of the variable should be an unsigned integer. You can try a floating-point data type too, but then you need to typecast it. That’s all! To read about floating-point conversion in AVR, read this article carefully – String Formatting of Avr
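The gate-and-read logic above can be sketched in plain Python (hardware-independent; on the ATmega8 the count would come from the timer/counter register rather than a list of edge timestamps, and the function name here is purely illustrative):

```python
def count_frequency(edge_times, gate_start, gate_seconds=1.0):
    """Count rising edges falling inside a gate window.

    edge_times: timestamps (in seconds) of rising edges of the input signal.
    Returns counts per second, i.e. the frequency in Hz.
    """
    gate_end = gate_start + gate_seconds
    count = sum(1 for t in edge_times if gate_start <= t < gate_end)
    # Read the count immediately when the gate closes, then reset it,
    # just as the AVR copies the timer register right after the delay loop.
    return int(count / gate_seconds)

# A 50 Hz square wave produces 50 rising edges per second.
edges = [i / 50.0 for i in range(200)]  # 4 seconds of a 50 Hz signal
print(count_frequency(edges, gate_start=1.0))  # 50
```

The same principle holds on the microcontroller: counts per one-second gate equals frequency in hertz.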

And yes, it’s better to apply a conditioned signal for counting the frequency, i.e. a square wave or a train of pulses. You may use a suitable signal conditioning circuit such as a comparator, a Schmitt trigger, or a sine-wave-to-square-wave converter, whatever suits you. If the signal is of low power, use a conditioning circuit. You can find lots of signal conditioning circuits on this website – check here – Signal Conditioner Circuits

Here are the technical details of my project. I hope you won’t have much trouble building it.

Best Graphics Card for Fortnite in 2021: Benchmarks for 4K Gaming

If you want to play Fortnite in Ultra HD 4K, some of the best graphics cards for Fortnite in 2021 are listed below, along with benchmarks of the game in Ultra HD 4K. So have a look at these best graphics cards for Fortnite.

As you all know, there are many multiplayer games like PUBG, COD Warzone, Fortnite, GTA V Online, and many more. And everyone wants a high-end graphics card so they can play their game in Ultra HD 4K, because in this era graphics really do matter.

In some games graphics really don’t matter much, but in recent titles they do, so you have to buy one of the best graphics cards for Fortnite in 2021 so you can play your games in 4K Ultra HD.

How do we pick?

Unlike other PC components, where we have a wide range of companies to choose from, here we have only two brands to consider, AMD and NVIDIA, if you are after a reference or stock graphics card. Things can get a little crazy when it comes to third-party graphics cards, however, where prices, stability, and performance can vary a lot.

We have picked up all the top models from each tier and factored in benchmark scores. Reviews from users and experts play an essential role in our decision making as well.

There are many graphics cards that let you play games in Ultra HD 4K, and in Fortnite they can greatly enhance your gaming performance.

Best graphics card for Fortnite 2021

Whether you are looking for the most expensive and powerful graphics card or for the best budget graphics card for Fortnite, you will find one that is perfect for you when you go through this review article. Before moving to the best GPU for Fortnite, let us first go through the Fortnite game requirements.

Fortnite recommended system requirements:

  • Processor: Core i5 2.8 GHz.
  • Memory: 8GB RAM.
  • Graphics card (NVIDIA): GTX 660.
  • Graphics Card (AMD): Radeon HD 7870.

Minimum system requirements

  • Processor: Core i3 2.4 GHz.
  • Memory: 4 GB RAM.
  • Graphics: Intel HD 4000.

Now move to the list of best graphics cards for Fortnite.

But one more thing: the graphics card alone does not get you high graphics, i.e. 4K Ultra HD. Many components, like the CPU, RAM, PSU, and motherboard, must be compatible with your high-end graphics card for you to play games in 4K Ultra HD. So here is the list of cards that can enhance your gaming performance.

Best Graphics Cards For Fortnite in 2021

  • EVGA GeForce RTX 2080 Ti
  • GeForce RTX 2080 Super
  • MSI Gaming GeForce RTX 2080 8GB
  • ASUS Dual NVIDIA GeForce RTX 2060 Super

1. EVGA GeForce GTX 1060 SC Gaming – Best Graphics Card for Fortnite

EVGA GeForce GTX 1060 SC gaming Review

Let’s start our review with the best budget graphics card for Fortnite, EVGA’s GeForce GTX 1060. A lot of experts compare it with NVIDIA’s reference models, but there are several striking differences between the two.

Let’s have a quick look at the specs of the graphics card. The card is packed with a 2027 MHz memory clock, 6 GB of GDDR5 memory, a set of radial fans, a single DVI-D port, 3 x DisplayPort, and a single HDMI 2.0 port. In short, the card is a powerhouse that allows you to play much more demanding games than Fortnite, with much more exciting gameplay.

One of the prominent advantages of the GTX 1060 is that it is very compact and pretty light. It can easily be a part of almost any practical tower, although it doesn’t feature a backplate. Moreover, the card ensures increased airflow with minimum power consumption; the optimized fan blades and highly efficient motors deliver impressive performance.

All things apart, the best part about the card is that it is priced below $500, making it the best budget graphics card for Fortnite on the market.


  • 6GB RAM which is 3x more than you need.
  • An excellent option for those who are looking for a card with more Video Memory for heavier games.
  • GPU can be modified with ease


  • The cooling system is smaller than expected.
  • Expensive.

2. EVGA GeForce GTX 1070 Gaming ACX Graphics Card

EVGA GeForce GTX 1070 Review

Second, we have a GPU model which is quite similar to the first one: one of the most famous and best graphics cards for Fortnite, EVGA’s GeForce GTX 1070 Gaming ACX. As the name shows, the card is an upgraded version of the GTX 1060 we just reviewed. These are some of the significant improvements you can expect.

Firstly, there’s a significant increase in GPU frequency, and the memory is also improved. The card now holds 8GB of GDDR5, though this is a lot more than you need to play Fortnite.

You only need 2 GB to play Fortnite, so consider this card if you are planning to play more massive games. The cooling problem finally got solved with this graphics card: the cooling fans are entirely redesigned to ensure more headroom.

In terms of memory bus width it’s average, the same 256 bits used in the GeForce GTX 970 and GTX 980, but that doesn’t have much impact on performance. Another quite exquisite technology present on board this graphics card is Double BIOS. It also boasts the 360-degree image capture for an improved immersive experience that virtually all GeForce GPUs offer.


  • GeForce GTX 1070 has a brilliant cooling system.
  • Plenty of memory can run for the Fortnite program without suffering a spike.
  • Exquisite features – adjustable RGB, 360-degree image capture, and more.


  • Very expensive.
  • Upgrading requires drivers (like DirectX).

3. NVIDIA GeForce RTX 2080 Ti (Best 4K Graphics Card for Fortnite)

NVIDIA GeForce RTX 2080 Ti Review

If you are into 4K gaming, the NVIDIA GeForce RTX 2080 Ti is the best gaming graphics card you can buy. So don’t be surprised that this graphics card isn’t at the top of the list even though it is the latest one from NVIDIA.

That’s because we are talking about Fortnite here; but no worries, if you play 4K games and happened to come across this review article, it will help you too. The card uses the company’s new Turing architecture with the TU102 GPU. That means you get real-time ray tracing and advanced shading technology with Tensor Cores, and these new cores run AI algorithms to give your system a flashy boost.

The base edition of this graphics card comes with 11GB of GDDR6 at a speed of 14 Gbps. The memory interface width is 352 bits, while the bandwidth clocks in at 616 GB/s. The base clock speed of the card is 1,350 MHz, while the OC rate is 1,635 MHz. Compared to the GTX 1080 Ti, NVIDIA has upped the 2080 Ti’s CUDA core count by around 800, to a whopping 4,352.

The card uses an 8 + 8 pin setup with a TDP rating of 260 W on the base edition. Cooling comes from twin dual-axial 13-blade fans and a single vapor chamber. The Titan is more potent in some respects, but the RTX 2080 Ti is built specifically for gamers, and only this card allows you to play 4K games without sacrificing performance.


  • Outstanding performance.
  • Turing GPU.
  • 11GB of GDDR6.
  • Ray tracing and AI technology.


  • Expensive.

4. GeForce RTX 2080 Ultra Gaming – Best Graphics Card for Fortnite

GeForce RTX 2080 Review

Well, everyone would love to have an RTX 2080 Ti installed in their gaming machine, but in case your budget doesn’t cooperate, the GeForce RTX 2080 Ultra Gaming is another of the best graphics cards for Fortnite you can get, and you can still do some 4K gaming with it.

The card is still available in the market in stock condition, but we have decided to go for EVGA’s FTW3. You still get 8GB of GDDR6, a TU104 GPU, and a 256-bit memory interface.

This card is a newly added member of the RTX series, so you expect all goodies you’d find on other premium cards of NVIDIA. The list of premium features includes NV Link, HDCP 2.2, Ray tracing at 8 Giga Ray/s and Ansel.

Thanks to its Turing architecture, you still get AI-enhanced graphics plus 2,944 CUDA cores and a maximum native resolution of 7,680 x 4,320. The card has 3 x DisplayPort, a USB Type-C port, and an HDMI 2.0b port.

The size is significant for two reasons: iCX2 cooling tech and RGB lighting. Under the hood there are 3 x HDB fans on this GPU, which keep things running cool with the help of the iCX2 cooling system. In short, the card’s offerings are just mind-blowing.


  • 1,860 MHz OC with 8GB of DDR6.
  • 14 power phases.
  • iCX2 Cooling Tech.
  • RGB lighting.


  • Not as smooth as 2080 Ti on 4k games.

5. MSI Gaming GeForce RTX 2060 (Best Budget Graphics Card for Fortnite)

MSI Gaming GeForce RTX 2060 Review

When talking about PC components, gaming parts, and high-end builds, MSI is one of the most famous and leading brands. Our first pick from MSI is the MSI Gaming GeForce RTX 2060. When it comes to features, there are more similarities than differences between the 2060 and the 2080.

This card has advantages for those who are into deep-learning supersampling and ray tracing, although the latter goes down to 5 Giga Rays/s. The card has all the features found on other RTX-series cards; the only few differences are the memory interface width at 192 bits, the speed at 14 Gbps, and the bandwidth at 336 GB/s.

Like its brethren, the MSI RTX 2060 can handle up to four displays at a time, with three DisplayPorts and an HDMI port, and you’ll only have to deal with a single 8-pin connector this time. The card has 6GB of GDDR6 with a boost clock of 1,710 MHz.

What makes this card different from the standard edition is the cooling system: on board is MSI’s TORX 2.0 fan system, which combines traditional and dispersion fan blades. The card has 1,920 CUDA cores, and some extra touches by MSI are like icing on the cake.

Overall, the GeForce RTX 2060 will be a substantial upgrade for you unless you are already using one of its siblings or a pricier, popular card like the GTX 1080 Ti.


  • Great value for the price.
  • Ray tracing and DLSS.
  • Afterburner.
  • TORX fan system.


  • None.

6. AMD Radeon Vega 56 – Best Graphics Card for Fortnite

AMD Radeon Vega 56 Review

We can’t deny that AMD’s graphics cards don’t quite measure up to NVIDIA’s, but if you have a FreeSync monitor and are looking for solid graphics from a leading company, then the AMD Radeon Vega 56 is the thing you are looking for.

We are talking about the powerful AMD Radeon Vega 56. This well-reputed card has three 92mm fans mounted on top to keep things running smoothly even under heavy workloads, although those loads won’t be as strenuous as with a top-tier NVIDIA card. The card ranks just a hair under the 1070 Ti and packs 3,584 stream processors with flashy memory speed.

AMD equips the RX Vega 56 with 8GB of HBM2 memory on a full 2048-bit interface, which is pretty impressive. The memory clock is 800 MHz, which works out to 410 GB/s of bandwidth at an effective speed of 1.6 Gbps. The GPU clock is slightly higher than the stock edition, from 1,177 MHz up to 1,478 MHz on top.

Unfortunately, 4K is not this card’s strong suit, but luckily it handles QHD with ease, with a maximum native resolution of 6,092 x 2,160. The board uses a 6-pin/8-pin power configuration. The card measures 305mm x 140mm x 41.8mm and only requires two slots in your case.

With this AMD GPU you also get CrossFire, FreeSync 2, and Radeon Chill power-saving technology. AMD has some higher, faster cards like the Radeon VII, but they are not as budget-friendly as other cards from the company.

This graphics card strikes a perfect balance between price and performance; compared to other cards, including the Vega 64, it looks like a good deal for QHD gamers in search of the best graphics card for Fortnite.


  • 8 GB of HBM2.
  • Excellent price point.
  • Watt Man software.
  • Quiet and efficient.


  • Some reports of coil whine.

7. XFX Radeon RX 580 GTS (Cheap Graphics Card for Fortnite)

XFX Radeon RX 580 GTS Review

If you are in search of a high-end performance graphics card you can get under $300, this XFX Radeon RX 580 is it. It sports a sleek design and can handle incredibly high frame rates while gaming with ease. The graphics card is fitted with XFX Double Dissipation cooling.

If you’re into 1080p at 60 fps, then the RX 580 is entirely right for you. The Radeon RX 580 requires one 8-pin connector from your power supply. The card comes with a driver CD and a quick-install guide, so you won’t face any problem connecting it to your gaming setup.

The card is built with 8GB of GDDR5 video memory and has a dual BIOS. The excellent Chill tech ensures that the GPU doesn’t overheat, saves power, and facilitates dynamic regulation of frame rates based on your movements when playing games.

Moreover, the card has an impressive clock speed of 1,366 MHz, and the minimum recommended power supply is 500 watts. A fun fact: the GPU’s fans don’t spin until the temperature crosses 60 degrees Celsius. The graphics card has enough outputs to connect multiple monitors, and you won’t face any issues when installing drivers.

Another advantage is its enhanced VRM technology, which ensures the lowest noise levels, another feature that makes this graphics card stand out from similar products in this price range. It can run Fortnite flawlessly and surpass your expectations. In short, the RX 580 can comfortably handle live streaming, sports games, and GPU-intensive encoding at the same time.


  • Runs Fortnite well at 1920 x 1080.
  • Looks fantastic with faux carbon fiber of the shroud.
  • Easily clocks to 1425 MHz.
  • It doesn’t require any boost in power or voltage to hit overclocks.
  • The card is tranquil during operation.


  • Dimensions are huge.

8. NVIDIA GTX 1080 – Best Graphics Card for Fortnite

NVIDIA GTX 1080 Review

A few years back, NVIDIA’s Pascal GPU architecture was at the top of every PC enthusiast’s component wish list. The GTX 1080 was the very first card to come with a manufacturing process smaller than 28nm. Unfortunately, it doesn’t run as cool as the previous model of the series, but it still holds firm at around 82 degrees C.

What’s interesting about this card is its DisplayPort 1.4 connection standard. When it comes to 4K gaming, the lack of higher-refresh-rate monitors is a huge limiting factor; NVIDIA claims that 2 x 1080s in SLI will be able to push 4K resolutions at 144Hz. DisplayPort 1.4 also supports resolutions as high as 8K (7680 x 4320) at a 60Hz refresh rate with HDR, or 4K at 120Hz with HDR.

The NVIDIA 1080 has the power to back those ports. It comes fitted with 2,560 CUDA cores, 160 texture units, 8GB of Micron’s GDDR5X VRAM, and 64 ROPs, with the GPU boost clock running in at a fast and comfortable 1,733 MHz. Is there anything more Fortnite will need? I guess not.

There is also plenty of overclocking headroom in the GTX 1080. With a 256-bit bus, its memory hits an effective 10,000 MHz and 320GB/s of bandwidth, making it capable of performing well at 4K and 1440p without any drop in performance. So the NVIDIA GTX 1080 is more powerful, contains more transistors and improved memory, and generally rocks the show at 4K.


  • Excellent all-round performance.
  • Makes 4K gaming viable.
  • Classy design.
  • Quiet Cooler.


  • High launch price.
  • Aftermarket cards will be better.
  • Specced for less.

9. MSI Gaming GeForce GTX 1080 8GB GDDR5X

MSI Gaming GeForce GTX 1080 Review

One of the best graphics cards, chosen by 20 million gamers around the globe. It was voted the best graphics card of 2016, and it still carries the same energy and robust performance to handle massive games like Fortnite, PUBG, etc. The card gives you both fantastic performance and outstanding value for the price. The thermal potential of the GPU is quite surprising, and big metal plates are included for extra toughness and a finished look.

With this card under the hood, Fortnite will automatically jump to its higher settings and you will observe a definite change in performance. You will feel like you are playing a whole different game.

The GTX 1080 is a fantastic companion and stays very cool, as it is fitted with two fans. It has 8GB of GDDR5X RAM and comes in a Twin Frozr VI thermal design. The fan design generates 22% more air pressure for extremely silent performance; what makes it silent are the double ball bearings fitted in the fans, which give strong support for smooth operation. Overheating is handled by airflow-control technology.

The GPU also features RGB lighting to give your gaming setup a fancy look. Unlike other graphics cards, which take up to 3 expansion slots, the MSI GTX 1080 takes only 2, preventing unnecessary blockage of space.

The GTX 1080 is still an advanced-level graphics card that provides the ultimate gaming platform and is the best GPU for Fortnite. It also supports multi-GPU technologies. Moreover, VR technology ensures you get the perfect level of performance and image quality, giving your gaming experience rides full of joy.


  • The performance is quite amazing.
  • Build quality.
  • Beautiful design.
  • Reference board layout.


  • Expensive.

10. Gigabyte GeForce GTX 1050 Ti OC 4GB GDDR5 128-bit PCI-E – Cheapest Graphics Card

Gigabyte GeForce GTX 1050 Ti OC 4GB GDDR5 Review

Another reliable pick and one of the best budget graphics cards for Fortnite, able to handle pretty much any game. The GTX 1050 Ti is a bit weaker than most of the cards we have picked above; it has only 4 gigabytes of memory, but that is still more than what you need.

This GPU is specially made for gaming enthusiasts and has one of the most advanced builds and architectures. It has a custom-designed fan cooler that prevents the GPU from overheating, and the fan is built to keep noise and speed at a low level. It features the WindForce 2X cooler with Blade Fan design, which dissipates heat effectively through the blades for maximum airflow to keep the system cool.

The GTX 1050 Ti is a card you can customize according to your needs: you can play Fortnite in silent mode, gaming mode, user mode, or OC mode. The GPU handles both higher and lower clock speeds and lets you decide what’s best for playing Fortnite.

The GTX 1050 Ti has 4GB of GDDR5 128-bit video memory. This card also supports multi-screening; you can connect three HDMI monitors to it simultaneously, plus 1 x DP. It comes with a backplate that offers rigidity and protection to the architecture.

The card is best for those looking for a midrange GPU. It works at 1,870 MHz core and 3,504 MHz memory. You can play Fortnite with ultra-high settings flawlessly. With 16.7M customizable color options and fascinating lighting effects, you can choose which vibes your build should follow. In a nutshell, the GPU is well-engineered with high-class chokes and capacitors, delivering outstanding performance and a durable system lifespan.


  • Fast, smooth, and efficient gaming.
  • Support for the latest DirectX 12 features.
  • Improved performance and fantastic offerings.


  • Fans might be noisy.

11. ASUS ROG Strix GeForce RTX 2080 Ti Overclocked

ASUS ROG Strix GeForce RTX 2080Ti Review

How could we not mention ASUS? This card is a beast: it’s fast and runs cool. The performance of this graphics card is truly outstanding and makes Fortnite look as authentic as it can be.

The Super Alloy Power II design includes premium alloy chokes, solid polymer capacitors, and a wide array of high-current power stages. 0dB tech lets the fans stay idle until the temperature rises above 55 degrees C.

The RTX 2080 Ti can support up to 4 monitors through DisplayPort 1.4 and HDMI 2.0, and a VR headset through the USB Type-C port. It has a new cutting-edge shroud for multi-card configurations and small chassis.

With this card you can take 4K gaming to another level. Aura Sync RGB lighting can handle six lighting effects and millions of colors to give your gaming setup an attractive look. Overclocking is comfortable with this GPU, and what’s more impressive is GPU Tweak II, which makes monitoring performance and streaming easier.

The GPU has 11 GB of RAM, and the card includes triple axial-tech fans with MaxContact technology for increased cooling. The Strix RTX 2080 Ti is expensive, but I assure you that every single penny is worth it.


  • Great overclocking.
  • Runs cool.
  • RGB is great.
  • Very efficient.


  • Best but expensive.

Best Graphics card for Fortnite Buyer’s guide

If you are a beginner and still have a lot of questions in mind, go through this buyer’s guide; we are sure all of your questions about what to look for when buying the best graphics card for Fortnite will be answered.

You would normally have to gather information from many different sources; we have already done that for you, all in one place. All you have to do is relax and go through the buyer’s guide.

These are some following specifications that you should keep in mind before buying the best graphics card for Fortnite.

  • Thermal Design Power

First things first: make sure your power supply is sufficient for the GPU you want to install. It is essential to know what kind of power supply the GPU requires, so look at the requirements and then assess your particular situation.

  • Video Memory

As we told you, Fortnite requires only 2GB of VRAM to get started, which means every GPU listed will work flawlessly. However, the closer you get to that minimum, the riskier it becomes to play on ultra settings. We would advise you to get 6GB or above, which is already overkill, but the good thing is that with 6GB you can play other massive games as well.

  • Size

Another important factor is size. Some people have smaller cases, and for them larger GPUs don’t work; they will need to measure the case dimensions and then get a card of a suitable size. If you have a large case, you are safe; you don’t have to worry about size, since you’re more likely to have enough space for 4 GPUs.

  • Monitor

There is a chance that you buy a top-spec PC but still end up unable to play Fortnite. Your old monitor might be the reason, or maybe “black screens”; some monitors can’t support specific resolutions. So make sure to get a card that matches your monitor.

  • Connections

Also, make sure that the card you are going to buy has enough ports to support all the monitors you will be plugging into your PC.


Buying the best GPU for Fortnite is not something you do every day, so be careful when buying one. Before buying, do some research, decide what is best for you, and get all of your questions answered. To save your time, we have written this article so that you get the best of everything.

Every card mentioned above is perfectly capable of running Fortnite on ultra settings, so you are just one click away from getting the best graphics card for Fortnite delivered to your doorstep.

Battery charger circuit using SCR

A simple battery charger based on an SCR is shown here. The SCR rectifies the AC mains voltage to charge the battery. When the battery connected to the charger is discharged, the battery voltage drops. This prevents the forward-biasing voltage from reaching the base of transistor Q1 through R4 and D2, switching the transistor off. When the transistor is off, the gate of SCR H1 gets its triggering voltage via R1 and D3. This makes the SCR conduct, and it starts to rectify the AC input voltage. The rectified voltage is fed to the battery through the resistor R6 (5W), and the battery starts charging.
When the battery is fully charged, the base of Q1 gets a forward-bias signal through the voltage divider formed by R3, R4, R5 and D2, turning the transistor on. When Q1 is on, the trigger voltage at the gate of the SCR is cut off and the SCR turns off. In this condition a very small charging current reaches the battery via R2 and D4 for trickle charging. Since the charging voltage is only half-wave rectified, this type of charger is suitable only for slow charging; for fast charging, a full-wave rectified charging voltage is needed.
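The cut-off point set by the divider follows the standard voltage-divider relation. A hedged sketch with assumed example values (the article gives no component values; the resistances, the ~0.7 V base-emitter drop, and the ~0.7 V diode drop below are all illustrative assumptions, not the circuit's actual design values):

```python
def divider_voltage(v_batt, r_top, r_bottom):
    """Voltage at the divider tap feeding Q1's base (loading ignored)."""
    return v_batt * r_bottom / (r_top + r_bottom)

# Hypothetical thresholds: Q1 turns on (stopping the charge) when the tap
# voltage exceeds its base-emitter drop plus the series diode D2's drop.
V_BE, V_D2 = 0.7, 0.7
r_top, r_bottom = 10_000.0, 1_200.0  # assumed divider resistances, ohms

for v_batt in (11.5, 12.0, 12.5, 13.0, 13.5, 14.0):
    tap = divider_voltage(v_batt, r_top, r_bottom)
    state = "Q1 ON, charging stops" if tap >= V_BE + V_D2 else "charging"
    print(f"{v_batt:5.1f} V -> tap {tap:4.2f} V : {state}")
```

With these assumed values the cut-off lands between 13.0 V and 13.5 V, which is the general idea behind adjusting POT R4 to set the stop voltage.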

Circuit diagram with Parts list.


  • Assemble the circuit on a good quality PCB or common board.
  • The transformer T1 can be 230V primary, 18V /3A secondary step down transformer.
  • The voltage of the battery at which the charging should stop can be set by  the POT R4.
  • The battery can be connected to the charger circuit by using crocodile clips.

AMD’s Ryzen 9 5950X Shatters PassMark Records

If the Ryzen 5 5600X left you impressed, the Ryzen 9 5950X will blow you out of your chair. The 16-core monster has catapulted its way to the top of the mainstream processor chart in both single- and multi-thread performance.

The Ryzen 9 5950X scores are no longer available, but German publication ComputerBase managed to grab screenshots of the Zen 3 flagship’s results before they were erased. Given the circumstances, it’s uncertain whether the Ryzen 9 5950X was overclocked or whether it was paired with memory that surpasses the officially supported DDR4-3200 specification. While we wait for the full review, we recommend taking the PassMark scores with a pinch of salt.

While AMD has been injecting more cores in mainstream chips for a while now, the chipmaker’s offerings aren’t quite up to par with Intel’s parts when it comes to single-thread performance. If these PassMark numbers are accurate, it would appear that Zen 3 has finally tipped the scales in AMD’s favor.


AMD Ryzen 9 5950X Benchmarks

Processor       | PassMark Single-Thread Score | PassMark Multi-Thread Score
----------------|------------------------------|----------------------------
Ryzen 9 5950X   | 3,693                        | 45,563
Ryzen 5 5600X   | 3,495                        | 22,824
Core i9-10900K  | 3,176                        | 24,261
Ryzen 9 3950X   | 2,747                        | 39,277
Core i9-10980XE | N/A                          | 34,138

The Ryzen 9 5950X is reportedly up to 34.4% faster than the Ryzen 9 3950X in single-thread performance and up to 16% faster in multi-thread performance. For reference, the Ryzen 9 5950X comes with a 3.4 GHz base clock and a 4.9 GHz boost clock, while the Ryzen 9 3950X has a 3.5 GHz base clock and a 4.7 GHz boost clock. It was to be expected that the Ryzen 9 5950X would be the superior chip.

In comparison to its Intel rival, the Ryzen 9 5950X seemingly delivers up to 16.3% higher single-thread performance than the Core i9-10900K. Remember that the Core i9-10900K features a 3.7 GHz base clock and a whopping 5.3 GHz boost clock. We’re not underestimating Zen 3, but it’s a bit hard to swallow that an AMD chip with a 400 MHz lower boost clock would outperform the Core i9-10900K. For now, we’ll have to trust PassMark’s metrics until we get the chip in our lab for thorough testing.

Possessing substantially more cores, the Ryzen 9 5950X’s multi-thread performance doesn’t raise any eyebrows. The 16-core processor purportedly offers up to 87.8% higher multi-thread performance than the Core i9-10900K. Intel doesn’t offer more than 10 cores in its mainstream processors, so the Core i9-10980XE, which is a HEDT (high-end desktop) SKU, has to serve as the point of comparison for multi-thread performance. Despite being at a two-core disadvantage, the Ryzen 9 5950X’s multi-thread performance is apparently up to 33.5% faster than the Core i9-10980XE.
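The percentage figures above follow directly from the PassMark scores in the table; a quick check (these are the reported scores, not our own measurements):

```python
scores = {
    # (single-thread, multi-thread) PassMark scores as reported
    "Ryzen 9 5950X":   (3693, 45563),
    "Ryzen 9 3950X":   (2747, 39277),
    "Core i9-10900K":  (3176, 24261),
    "Core i9-10980XE": (None, 34138),
}

def pct_faster(a, b):
    """How much faster score a is than score b, in percent."""
    return (a / b - 1) * 100

s5950, m5950 = scores["Ryzen 9 5950X"]
print(f"vs 3950X   single: {pct_faster(s5950, scores['Ryzen 9 3950X'][0]):.1f}%")   # ~34.4%
print(f"vs 3950X   multi:  {pct_faster(m5950, scores['Ryzen 9 3950X'][1]):.1f}%")   # ~16.0%
print(f"vs 10900K  single: {pct_faster(s5950, scores['Core i9-10900K'][0]):.1f}%")  # ~16.3%
print(f"vs 10900K  multi:  {pct_faster(m5950, scores['Core i9-10900K'][1]):.1f}%")  # ~87.8%
print(f"vs 10980XE multi:  {pct_faster(m5950, scores['Core i9-10980XE'][1]):.1f}%") # ~33.5%
```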

If Ryzen 5000 (Vermeer) can deliver, Intel could be in big trouble, since its Comet Lake-S army will likely not be able to fend off Zen 3. It looks as though even Intel’s upcoming Rocket Lake processors might not be enough.