Problem: The starch source in a beer provides the fermentable material and is a key determinant of the strength and flavour of the beer. The most common starch source used in beer is malted grain. Grain is malted by soaking it in water, allowing it to begin germination, and then drying the partially germinated grain in a kiln. Malting grain produces enzymes that convert starches in the grain into fermentable sugars. Different roasting times and temperatures are used to produce different colours of malt from the same grain. Darker malts will produce darker beers.
What does a grain's starch become after it is malted?
The answer is the following: fermentable sugars


The staple foods were generally consumed around 11 o'clock, and consisted of bread, lettuce, cheese, fruits, nuts, and cold meat left over from the dinner the night before.[citation needed] The Roman poet Horace mentions another Roman favorite, the olive, in reference to his own diet, which he describes as very simple: "As for me, olives, endives, and smooth mallows provide sustenance." The family ate together, sitting on stools around a table. Fingers were used to eat solid foods and spoons were used for soups.[citation needed]
What type of food was cheese considered to be in Rome?
staple


Input: Antenna (radio)
Although a resonant antenna has a purely resistive feed-point impedance at a particular frequency, many (if not most) applications require using an antenna over a range of frequencies. An antenna's bandwidth specifies the range of frequencies over which its performance does not suffer due to a poor impedance match. Also in the case of a Yagi-Uda array, the use of the antenna very far away from its design frequency reduces the antenna's directivity, thus reducing the usable bandwidth regardless of impedance matching.

What causes the frequencies outside of the bandwidth to be unusable?
Output: a poor impedance match
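The bandwidth idea in the passage can be made concrete with a toy calculation. The sketch below models a resonant antenna as a series RLC circuit (purely resistive at its design frequency, increasingly reactive away from it) and finds the band where the VSWR against a 50-ohm feed stays at or below 2, a common usability cutoff. The component values, the 50-ohm reference, and the VSWR-2 threshold are illustrative assumptions, not properties of any particular antenna.

```python
import math

def vswr(z_antenna, z0=50.0):
    """Voltage standing-wave ratio of a complex feed-point impedance
    against a reference impedance z0 (assumed 50-ohm feed line)."""
    gamma = abs((z_antenna - z0) / (z_antenna + z0))  # reflection coefficient
    return (1 + gamma) / (1 - gamma)

def series_rlc_impedance(f, r=50.0, l=1e-6, f0=100e6):
    """Toy series-RLC model of a resonant antenna: purely resistive at f0,
    increasingly reactive (and so mismatched) away from it."""
    c = 1.0 / ((2 * math.pi * f0) ** 2 * l)  # choose C so resonance sits at f0
    x = 2 * math.pi * f * l - 1.0 / (2 * math.pi * f * c)  # net reactance
    return complex(r, x)

# Scan frequencies and keep those where VSWR <= 2.
usable = [f for f in range(90_000_000, 110_000_001, 100_000)
          if vswr(series_rlc_impedance(f)) <= 2.0]
bandwidth_hz = usable[-1] - usable[0]
print(f"usable band: {usable[0]/1e6:.1f}-{usable[-1]/1e6:.1f} MHz "
      f"({bandwidth_hz/1e6:.1f} MHz wide)")
```

At the design frequency the reactance cancels, VSWR is exactly 1, and the mismatch grows on either side — which is why only a band around resonance is usable, matching the passage's definition of bandwidth.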


Input: Article: This boom in innovative financial products went hand in hand with more complexity. It multiplied the number of actors connected to a single mortgage (including mortgage brokers, specialized originators, the securitizers and their due diligence firms, managing agents and trading desks, and finally investors, insurers and providers of repo funding). With increasing distance from the underlying asset these actors relied more and more on indirect information (including FICO scores on creditworthiness, appraisals and due diligence checks by third party organizations, and most importantly the computer models of rating agencies and risk management desks). Instead of spreading risk this provided the ground for fraudulent acts, misjudgments and finally market collapse. In 2005 a group of computer scientists built a computational model for the mechanism of biased ratings produced by rating agencies, which turned out to be consistent with what actually happened in 2006–2008.[citation needed]

Now answer this question: In what year did a group of computer scientists build a model for ratings produced by rating agencies that turned out to be accurate for what happened in 2006-2008?

Output: 2005


Article: In recent years, the city has experienced steady population growth, and has been faced with the issue of accommodating more residents. In 2006, after growing by 4,000 citizens per year for the previous 16 years, regional planners expected the population of Seattle to grow by 200,000 people by 2040. However, former mayor Greg Nickels supported plans that would increase the population by 60%, or 350,000 people, by 2040 and worked on ways to accommodate this growth while keeping Seattle's single-family housing zoning laws. The Seattle City Council later voted to relax height limits on buildings in the greater part of Downtown, partly with the aim to increase residential density in the city centre. As a sign of increasing inner-city growth, the downtown population crested to over 60,000 in 2009, up 77% since 1990.

Question: What change in building heights did Seattle make to increase population density in its downtown?
Ans: relax height limits


Input: Printed circuit board
In boundary scan testing, test circuits integrated into various ICs on the board form temporary connections between the PCB traces to test that the ICs are mounted correctly. Boundary scan testing requires that all the ICs to be tested use a standard test configuration procedure, the most common one being the Joint Test Action Group (JTAG) standard. The JTAG test architecture provides a means to test interconnects between integrated circuits on a board without using physical test probes. JTAG tool vendors provide various types of stimulus and sophisticated algorithms, not only to detect the failing nets, but also to isolate the faults to specific nets, devices, and pins.

To whom would you go to acquire the algorithms you'd use for the Joint Test Action Group procedures?
Output: JTAG tool vendors
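The interconnect-test idea in the passage can be sketched as a toy simulation. This is not a real JTAG stack — the net names and the fault model are made up for illustration — but it shows the principle: each net connects a driver cell on one IC to a receiver cell on another, and walking a single 1 across the nets lets the test isolate which specific nets are open or shorted, without physical probes.

```python
# Toy boundary-scan interconnect test: not real JTAG tooling, just the idea.
# Nets connect a driver pin on one IC to a receiver pin on another; the test
# drives patterns through the drivers and compares what the receivers capture.

NETS = ["net_a", "net_b", "net_c", "net_d"]  # hypothetical PCB nets

def capture(driven, opens=(), shorts=()):
    """Simulate what the receivers see, given driven values and injected faults."""
    seen = dict(driven)
    for net in opens:            # open net: receiver floats, modeled as reading 0
        seen[net] = 0
    for a, b in shorts:          # short: the two nets wire-OR together
        seen[a] = seen[b] = driven[a] | driven[b]
    return seen

def interconnect_test(opens=(), shorts=()):
    """Walk a single 1 across the nets; report every net whose captured
    value ever disagrees with what was driven (the failing nets)."""
    failing = set()
    for hot in NETS:
        driven = {net: int(net == hot) for net in NETS}
        seen = capture(driven, opens, shorts)
        failing.update(net for net in NETS if seen[net] != driven[net])
    return sorted(failing)

print(interconnect_test())                             # []
print(interconnect_test(opens=["net_b"]))              # ['net_b']
print(interconnect_test(shorts=[("net_a", "net_c")]))  # ['net_a', 'net_c']
```

Real boundary-scan tools shift these patterns through the chips' scan chains via the JTAG test access port and use more sophisticated stimulus sets, but the diagnosis step — comparing driven against captured values to isolate faults to specific nets and pins — is the same idea.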