Maximizing economic performance in shale requires optimal selection of well and cluster spacing, among other parameters. Reservoir engineering calculations can be used to optimize spacing, but these calculations are affected by uncertainties in the input parameters. System permeability is particularly important and difficult to measure. Diagnostic Fracture Injection Tests (DFITs) are often used to estimate permeability because they provide a direct, in-situ measurement. However, recent work has shown that conventional DFIT interpretation techniques can overestimate permeability in gas shale by two orders of magnitude. In this study, the impact of the permeability estimate is demonstrated using a dataset from the Utica/Point Pleasant. Production data are history matched with models assuming high and low permeability. Both models can be history matched because of the non-uniqueness between fracture area and permeability: a lower permeability can be offset by a larger effective fracture area. Sensitivity simulations are performed to assess the impact of well and cluster spacing on net present value. Relative to the high permeability model, the low permeability model has wider optimal well spacing and tighter optimal cluster spacing. The comparison shows that a more accurate permeability estimate significantly improves the economics of the optimized design. Because the low permeability model requires a greater effective fracture length to match production, it exhibits interwell production interference much earlier than the high permeability model. This is consistent with the operator's experience that outer wells outproduce inner wells within weeks to months of the start of production.
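The trade-offs described above can be illustrated with a deliberately simplified sketch. The toy model below is not the study's reservoir simulator: every coefficient (reference half-length, cluster drainage width, costs, revenue per cluster) is an illustrative assumption, chosen only to encode two ideas from the abstract. First, history-match non-uniqueness is imposed by holding the transient deliverability product, effective half-length times the square root of permeability, fixed, so a lower assumed permeability implies a longer effective fracture. Second, NPV is penalized when neighboring wells or neighboring clusters overlap their drainage. A grid search over spacing then shows how the assumed permeability shifts the optimum.

```python
import math

def section_npv(ws_ft, cs_ft, perm_nd,
                section_ft=5280.0, lateral_ft=10000.0,
                well_cost=8e6, cluster_cost=30e3, rev_per_cluster=200e3,
                k_ref_nd=100.0, xf_ref_ft=500.0, w_ref_ft=50.0):
    """Toy per-section NPV for a given well spacing (ws_ft), cluster
    spacing (cs_ft), and assumed permeability (perm_nd, nanodarcies).
    All coefficients are illustrative assumptions, not study values."""
    # History-match non-uniqueness: effective half-length grows as the
    # assumed permeability falls, keeping xf * sqrt(k) constant so both
    # models reproduce the same early-time production.
    xf = xf_ref_ft * math.sqrt(k_ref_nd / perm_nd)
    # Matrix drainage half-width around each cluster shrinks with
    # permeability, so low-perm rock rewards tighter clusters.
    w = w_ref_ft * math.sqrt(perm_nd / k_ref_nd)
    n_wells = max(1, round(section_ft / ws_ft))
    n_clusters = lateral_ft / cs_ft
    well_interf = min(1.0, ws_ft / (2.0 * xf))   # well-to-well overlap penalty
    cluster_eff = min(1.0, cs_ft / (2.0 * w))    # cluster-to-cluster overlap penalty
    revenue = rev_per_cluster * n_clusters * cluster_eff * well_interf
    cost = well_cost + cluster_cost * n_clusters
    return n_wells * (revenue - cost)

def optimize(perm_nd,
             ws_grid=(660, 1000, 1320, 2640, 5280),
             cs_grid=(10, 20, 50, 100, 200)):
    """Brute-force the spacing pair with the highest toy NPV."""
    return max(((ws, cs) for ws in ws_grid for cs in cs_grid),
               key=lambda p: section_npv(p[0], p[1], perm_nd))
```

With these assumed coefficients, `optimize(100.0)` (high permeability) lands on a tighter well spacing and wider cluster spacing than `optimize(1.0)` (low permeability), reproducing the qualitative result of the abstract: the low permeability model favors wider wells and tighter clusters. The dollar values themselves are meaningless; only the direction of the shift is the point.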