By J. McManus Recent advancements in AI promise huge developments in education. Generative AI has become accessible to people all around the world, but what does this mean for us?
The recent rise in the use of artificial intelligence has led to increased plagiarism and academic dishonesty. Teachers now run students' texts through AI detectors to determine whether they were produced by an AI tool, though in other areas, such as solving maths problems, AI use can be far harder to detect. Generative AI tools such as ChatGPT and Gemini are also known to give inaccurate results, called hallucinations (for example, invented book quotes or incorrect maths working).

Although artificial intelligence can easily be abused and is not always error-free, it also serves as a great tool for studying and for assisting teachers. It provides personalized solutions tailored to your needs and presents information in a way that most people can understand. A tool this powerful calls for a clear set of rules, and it is imperative that people are informed about the problems it may cause. The IB website compares this kind of AI technology to a calculator or a translator: using it without citing it is strictly forbidden, but it is not prohibited outright, and it can be a helpful study aid when researching context or familiarizing yourself with a new topic.

Since the artificial intelligence we have access to today is still learning and developing, it may take some time for it to mature into a tool capable of single-handedly teaching complex classes to children. Although we cannot be sure what the future holds, if AI continues to grow, in a few decades' time there might be classes of students learning directly with robots and receiving help and feedback tailored to their personal needs.
By B. Sapoznik In a world where everything competes for our attention, it is extremely difficult to filter the information we receive into knowledge, especially amid constant reports of fake news. In this process of filtration, it is even harder to keep up with new advancements, which arrive mixed in with the conspiracies, lies, and other chaotic information we are exposed to every day. If you haven't noticed, 2024 has set a great stage for game-changing technologies, and our vantage point late in the year gives us enough hindsight to reflect: which technologies have truly made their mark this past year?
In February 2024, Apple released the Apple Vision Pro, a device long speculated about in the technology market. The Vision Pro is a virtual reality headset featuring cutting-edge technology and remarkable eye- and hand-tracking sensors. On the hardware side, it contains a 23-million-pixel display, M2 and R1 chips, and a head-tracked spatial audio system, among other features. Yet for all the initial excitement among Apple consumers, the product has reportedly struggled to attract buyers, with around 300,000 units sold – a strikingly low number for a product so eagerly anticipated. Even though the headset has not been much of a commercial success, it has certainly been one of the major technologies of the year.

The past year has also seen significant growth in the biotechnology market. For instance, AlphaFold 3, developed by Google DeepMind and released in May 2024, uses generative AI to predict the structures of proteins; the work behind it earned the 2024 Nobel Prize in Chemistry. This technology makes the prediction and modelling of protein structures dramatically faster, making biological research far more efficient.

Furthermore, 2024 has probably been the year with the largest breakthroughs in AI models. From image and text generation to medical diagnostics and database analysis, the AI industry has grown exponentially, on such a scale that researchers estimate the AI market will reach US$305.9 billion by the end of this year. Of course, we have all been 'guinea pigs' along the way: there have been countless reports of faulty AI releases as companies raced to enhance their models' capabilities. Still, we have now reached an era of relative stability, and it is easy to picture how such technologies will expand in future years as companies such as Google, Meta, Apple, and Microsoft immerse themselves fully in the market. As we observe this rise, it is important to note the dangers society might face, such as job losses and, in later years, questions over who controls AI.

The expanding AI industry has also been closely tied to quantum computing, an area of computing that exploits the behaviour of subatomic particles – notably superposition and entanglement – to perform certain calculations far more efficiently than classical computers (the idea of superposition is made a little more concrete in the sketch below). In 2024, research has made quantum computers less error-prone, and scientists see AI and quantum computing working together as a way to amplify the capabilities of both fields. This is certainly another sector worth following in 2025 and beyond, given its predicted expansion.

In general, these have been the main sectors of innovative technology as of October 2024. Many other industries, such as microchips and energy, have also advanced, but they are expected to make their breakthroughs in future years, whereas the sectors discussed here have stood out specifically in 2024.
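For the curious, 'superposition' can be illustrated in a few lines of Python. This is only a toy sketch using NumPy, not a real quantum program: a qubit's state is written as a pair of amplitudes, a gate is a small matrix, and squaring the amplitudes gives the probabilities of measuring 0 or 1.

```python
import numpy as np

ket0 = np.array([1.0, 0.0])                   # the state |0>, like a classical bit set to 0
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # the Hadamard gate, a standard one-qubit operation

state = H @ ket0                   # superposition: equal amplitudes over 0 and 1
probs = np.abs(state) ** 2         # squared amplitudes give measurement probabilities

print(f"amplitudes = {state}, P(0) = {probs[0]:.2f}, P(1) = {probs[1]:.2f}")
# -> P(0) = 0.50, P(1) = 0.50: one qubit holding both outcomes at once until measured
```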
The technologies mentioned in this article have certainly been revolutionary this past year, and they will serve as a base for future technological advances and for assistance across study, work, and lifestyle as a whole.
By R. Renzo CRISPR, a name frequently mentioned in the field of genetics, has steadily been proving itself a groundbreaking tool for gene editing. But what exactly is it?
Although CRISPR is used broadly to refer to the systems that target and edit DNA, the term itself stands for "Clustered Regularly Interspaced Short Palindromic Repeats". The name refers to repeating DNA sequences in bacteria separated by short "spacer" sequences. When a virus infects a bacterium, the bacterium can capture and store small segments of the virus's DNA as new spacers. In the bacterium's natural defence process, the cell transcribes these spacer sequences into RNA upon viral infection. The RNA can then locate any DNA sequence that matches the spacer, and when the target is found, an enzyme produced by the CRISPR system (most famously Cas9) binds to the DNA and cuts it, disabling the gene (this matching step is sketched in the toy example at the end of this article).

Remarkably, researchers have also found ways to use CRISPR to activate genes rather than delete them. With these systems, we can now permanently modify genes in living cells and organisms, and possibly even correct mutations at precise locations in the human genome (the complete set of genetic material in a cell). As this technology continues to be refined, the possibilities are endless.

Its primary impact so far has been in treating human diseases caused by a known gene mutation. A prominent example is sickle cell anaemia, a genetic blood disorder that changes the shape of red blood cells, making them look like sickles. These misshapen cells clog critical blood vessels, causing obstructions that can lead to severe pain, infections, or even strokes. Treatment options are currently limited: patients require frequent blood transfusions, and their medications typically address only the symptoms of the disease, not its root. Very recently, in December 2023, the FDA approved the first CRISPR-based therapy, which edits patients' blood stem cells so that a healthy, functional form of haemoglobin is produced, and clinical use of the treatment is now getting under way.

In addition to editing the cells that make up most of the body, it is possible to edit the genetic material of gametes (egg and sperm cells) as well as early embryos. This is called germline editing, and any such edits would affect not only that person's genetic makeup but also that of anyone who inherits the edited DNA. A tool this powerful raises ethical issues: in theory, the technology could be used to enhance desirable traits rather than to cure disease. For now, scientists have set germline editing aside because of its ethical and societal implications, but it may yet be implemented in the future. Imagine being able to choose your own child's eye colour, height, and even intelligence! CRISPR has already changed the game for the treatment of disease, but can it change the cosmetic industry as a whole by introducing customizability to foetuses? Will we ever live in a society where this is not stigmatized, and even becomes normal?
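For readers who like to see ideas in code, here is a toy sketch in Python of the targeting step described above. It is purely illustrative: real CRISPR targeting works through RNA–DNA base pairing, not plain string matching, and every sequence below is invented.

```python
# Toy model of CRISPR targeting: the guide RNA carries a copy of a stored
# "spacer", and the Cas enzyme cuts wherever the genome matches it.

def find_cut_sites(genome: str, spacer: str) -> list[int]:
    """Return every position in the genome that matches the spacer."""
    sites = []
    start = genome.find(spacer)
    while start != -1:
        sites.append(start)
        start = genome.find(spacer, start + 1)
    return sites

spacer = "ATGGCCTAA"                        # invented fragment captured from a virus
genome = "GGGATGGCCTAACCCATGGCCTAATTT"      # invented DNA being scanned

for site in find_cut_sites(genome, spacer):
    print(f"Match at position {site}: cut here, disabling the gene")
```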
By G. Maranhão Have you ever seen a phone book in your life? I am sure most of you have not, since they began fading out before the oldest of the current students at St. Paul's started learning how to read. The telephone book, first used in the 19th century and greatly valued as a source of information, was deemed useless and 'wasteful' just years after the modern smartphone was released. After all, when you have a super-advanced, touch-screen mobile phone in your pocket, with access to the internet and the ability to remember all those phone numbers, why would you need a thick book full of names and numbers? And why would people need GPS devices, wristwatches, MP3 players, calculators, flashlights, digital cameras, or many other once-important gadgets? Suddenly, in 2007, the world was turned upside down by a device smaller than a postcard.
However, the concept of the smartphone began more than a decade before the iPhone's release. The idea first took shape in 1994, when IBM's Simon, the first mobile device to combine a phone with touchscreen features like email and a calendar, was released. Inspired by this, brands like Nokia, BlackBerry, and Palm advanced mobile technology through the 1990s and early 2000s, adding email, basic web browsing, and keyboards for messaging. Even so, phones were a rarity then and not useful for much beyond messaging and calls. The true breakthrough, as we all know, came when Apple launched the iPhone in 2007, equipped with a touch-focused design and an app ecosystem that redefined the industry.

Have you ever considered how drastically things changed in the short time following the iPhone's release? This tiny device, developed in secrecy for years, sparked a transformation that no one could have anticipated. Smartphones provided a wide range of functions that mobile phones at the time simply couldn't: internet access, GPS navigation, high-quality photography and videography, mobile payment options, and much more. According to Statista, in 2015, eight years after the iPhone's release, the global median for smartphone ownership was 43%. As of 2024, more than 4.9 billion people, around 60% of the world's population, are estimated to own one. Considering that only 74% of people in the world have access to safe drinking water, that statistic does seem astronomically high, does it not?

And yes, I understand that many people today only discuss smartphones to lament how they have allegedly ruined humanity and robbed younger generations of their childhood. There is certainly some truth to this; for instance, the average screen time in Brazil is 5 hours and 17 minutes, the second highest in the world, which adds up to about 80 days a year. Nevertheless, as someone who has sat through countless assemblies about how phones are supposedly degrading our brains, I prefer to take a more balanced view.

Dr. Amy Orben, a psychologist at the University of Cambridge, provides valuable insight into this debate. She notes that "it's a very human trait to worry about new technologies," and that this apprehension is part of a broader, consistent pattern. Orben argues that the fears surrounding smartphones as a singular entity are "overblown," asserting that "phones are tools." Ultimately, she highlights that "like all other technologies, their impact really depends on who uses them and how."

I am very much inclined to believe that. Civilisation has shown constant anxiety about novelty for centuries, be it when trains were first invented, when the printing press appeared, or during the recent surge in artificial intelligence. It seems humanity is primed to believe it is under perpetual threat and that anything new will doom us entirely. The smartphone was created as nothing more than a tool: a tool for entertainment, utility, and everyday use. It is undeniable how much smartphones have helped humanity, even if their impact, positive or negative, varies significantly depending on the user. Discussions about this topic should not focus on condemning the smartphone as the worst thing ever to happen to humanity, but on how to engage with it efficiently and appropriately. Instead of shunning this powerful tool, we should embrace the responsibility that comes with it.
Everybody knows that the smartphone is a double-edged sword: while it presents challenges, it also opens up endless opportunities. Rather than viewing it solely as a detriment to society, we should aim to understand its complexities and impact, honoring the purpose its developers intended decades ago. By encouraging responsible usage, we can harness the power of smartphones without wasting away our lives, ultimately paving the way for a better society.
By C. Moura It all started when Katalin Karikó moved to the US and met Drew Weissman, who at the time was working on a formula for an HIV vaccine. Karikó suggested that he try the mRNA method she was researching, which had not yet been very successful but which she believed could open new doors for medicine. Years later, after a long search, this work gave them what they needed to find the formula for a COVID-19 vaccine. The idea is that messenger RNA (mRNA) carries instructions copied from DNA; delivered into a cell, it 'tells' the cell how to protect itself from the virus. Interestingly, they did not even need the virus itself in hand, just its genetic sequence. And did you know that the technique discovered by Karikó and Weissman greatly speeds up the development of vaccines, and could in the future help combat other infections, diseases, and even certain types of cancer?
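To give a tiny flavour of what 'instructions copied from DNA' means, here is a toy Python sketch of transcription, the step where a DNA template is read off into messenger RNA. The sequence below is invented for illustration; real vaccine mRNA encodes an entire viral protein and runs to thousands of bases.

```python
# Toy model of transcription: build the mRNA strand complementary to a DNA
# template strand. Purely illustrative; the sequence below is made up.
COMPLEMENT = {"A": "U", "T": "A", "G": "C", "C": "G"}

def transcribe(dna_template: str) -> str:
    """Return the mRNA 'message' read off a DNA template strand."""
    return "".join(COMPLEMENT[base] for base in dna_template)

print(transcribe("TACGGGAAT"))  # -> AUGCCCUUA (AUG is the 'start' signal)
```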